AMD: Volcanic Islands R1100/1200 (8***/9*** series) Speculation/ Rumour Thread

My Sapphire R9 290 came today. I ran Bioshock Infinite for over 20 minutes and could barely hear the fan. But when I alt-tab to the desktop to check fan speed & temp via CCC, the fan kicks up to 47% and the noise is fairly loud. When I alt-tab back into the game, the fan slowly winds back down until I can barely hear it again.
 
AMD really gave themselves a headache with this cooler situation. Unsurprisingly, nVidia is actively "encouraging" reviewers to look for variability. Hardware.fr obliged and found at least one retail 290x to be ~7% slower than their press sample.

http://www.hardware.fr/articles/910-31/trop-variabilite.html

Of course this is only in quiet mode and is probably only an issue for reference cards. AMD's response was the same thing Dave mentioned in this thread: the PWM-based fan control is causing the fan on some cards to spin more slowly than expected, leading to higher temps and lower clocks.
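
For illustration only, here's a toy sketch (Python, with every number and coefficient invented, nothing to do with AMD's actual firmware) of why the same PWM duty cycle can translate into different performance on different retail cards: the same duty setting yields a different RPM on each fan sample, the slower fan can't remove the full heat load at the thermal target, and the clock drops to compensate.

Code:
# Toy model only: a fixed 40% PWM duty gives a different RPM on each fan
# sample; if the resulting cooling capacity can't remove the heat at the
# ~95C target, the clock (and with it, power) is pulled down.

BASE_CLOCK_MHZ = 1000                               # the advertised "up to 1GHz"

def steady_state_clock(rpm_at_40pct_duty, watts_to_dissipate=250):
    cooling_capacity_w = 0.12 * rpm_at_40pct_duty   # invented coefficient
    if cooling_capacity_w >= watts_to_dissipate:
        return BASE_CLOCK_MHZ                       # fan keeps up: full clock
    # crude assumption for this sketch: power roughly proportional to clock
    return int(BASE_CLOCK_MHZ * cooling_capacity_w / watts_to_dissipate)

for sample, rpm in [("press sample", 2200), ("retail card", 2050)]:
    print(sample, steady_state_clock(rpm), "MHz")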

 
But that's still three orders of magnitude larger than the microsecond-level granularity needed to regulate voltage.

I'm trying to see how this would typically work (rough sketch after the list):
- some unit gets a burst of activity
- this raises an alarm of potential localized critical temperature
- lower voltage by a few mV?
- lower clocks across the whole die as well, because the lower voltage slows transistor switching speed. (I'm assuming the voltage drop applies to the whole die.)
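
Something like this, as a very rough sketch (Python, everything invented), just to show the ordering and the mismatch in timescales between the fast sensors and the slower DVFS step:

Code:
# Rough sketch only -- names, thresholds and numbers are all made up.
# The point: on-die activity/droop sensors react on a microsecond scale,
# while an actual voltage/frequency step takes on the order of milliseconds,
# so the fast loop has to raise the alarm (and could throttle locally)
# long before the slow loop has actually moved the voltage.

voltage_mv = 1150
clock_mhz = 1000

def fast_loop(activity):            # ~microsecond granularity (sensors)
    # a burst of activity in some unit raises an alarm about a potential
    # localized critical temperature / voltage droop
    return "raise_alarm" if activity > 0.9 else "ok"

def slow_loop():                    # ~millisecond granularity (DVFS step)
    global voltage_mv, clock_mhz
    voltage_mv -= 6                 # lower voltage by a few mV...
    clock_mhz -= 10                 # ...and lower clocks across the whole die,
                                    # since the lower voltage slows switching

if fast_loop(activity=0.95) == "raise_alarm":
    slow_loop()
print(voltage_mv, "mV,", clock_mhz, "MHz")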

Modern thermal monitoring and regulation techniques utilize a variety of methods both in the analog and digital realm. The Radeon parts are probably likewise using a significant combination of analog and digital solutions.

There are lots of tricks that are played, not only for thermal monitoring and regulation but also for local voltage droop control. It's not at all uncommon these days for execution units, for instance, to self-throttle in order to reduce di/dt ramps. I'd wager that AMD also has some digital activity frequency measurements and throttle capabilities built into the hardware and linked with the drivers/software.

So the big takeaways are:

Thermal monitoring:
- voltage monitoring
- current monitoring
- local activity monitoring
- global activity monitoring
- localized thermal diodes
- global thermal diodes

Thermal regulation:
- voltage regulation
- frequency regulation
- global activity throttle
- local activity throttle
- local frequency division (not a new frequency but temporary half clocking)
- global frequency division

and all of these can be done in hardware, software, or a combination, and the exact mix likely varies (toy sketch below).
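
As a toy illustration of the idea (Python, with the policy, names and thresholds all invented), several of those monitored quantities can feed several of those regulation knobs at once:

Code:
# Toy sketch of how the monitored quantities above might feed the regulation
# knobs. Everything here (names, thresholds, the policy itself) is invented
# purely to illustrate several sensors driving several actuators.

def regulate(sensors):
    actions = []
    # a local thermal diode or droop sensor tripping: act locally and fast
    if max(sensors["local_temps_c"]) > 100 or sensors["droop_mv"] > 50:
        actions.append("halve_local_clock")       # temporary half-clocking
        actions.append("throttle_local_issue")    # limit di/dt ramps
    # global limits (board power, average temperature): act globally, more slowly
    if sensors["board_power_w"] > 290 or sensors["avg_temp_c"] > 95:
        actions.append("step_down_vf_state")      # global voltage/frequency step
    return actions

print(regulate({"local_temps_c": [88, 102, 91], "droop_mv": 20,
                "board_power_w": 250, "avg_temp_c": 94}))
# -> ['halve_local_clock', 'throttle_local_issue']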
 
Nvidia opened this can of worms with Boost, which was variable in its essence. So yes, it's only proper that they are now the ones crying foul.
 
Nvidia opened this can of worms with Boost, which was variable in its essence. So yes, it's only proper that they are now the ones crying foul.

They didn't open up a can of worms; they merely arrived a little earlier than AMD at a type of technology that was always going to be on the tech's natural progression curve. AMD now seems to have overtaken them slightly in the sophistication of the implementation, but the issue isn't with the technology itself, it's with how it's marketed.

There's no escaping the fact that the 290X being advertised as "up to 1GHz" is totally and completely misleading.

By contrast, NV's tactic of marketing both a guaranteed base speed and a generally quite conservative typical boost speed is far more informative and less misleading to the consumer.

They are doing us a service as consumers by using this method, so of course they'll want to point that out. Attacking them for it only props up bad behavior from the IHVs and thus does you, as a consumer, a disservice.

We'll all be losers if NV decides to follow AMD's example and market the 780Ti purely as "up to 1006MHz" instead of its actual base speed of 876MHz and typical boost speed of 928MHz.
 
Huh? It's got a toggle button; in one setting it's generally running at 1GHz and in the other it's running at less. I wouldn't call that "totally and completely misleading" then.
And for this particular segment I could imagine that the majority would rather have a lower-clocked card performing at a certain level than a higher-clocked one ("it's efficient / must have a lot of headroom then"), so I'm not sure it's even a good idea to market the max clock.

But as with the 680 onwards, reviewed performance may be misleading compared to the end-user experience (golden press samples, open test benches and short benchmark runs all showing higher performance). That's the can of worms; AMD has just taken it a step (or two) further than Nvidia so far (while the more serious sites have slowly adapted to the new situation since it started).
 
I'm not sure this sort of technology "was always going to be on the natural progression curve of the tech". AMD was quite proud to tout the deterministic nature of PowerTune when it was first introduced, while NVIDIA's Boost was non-deterministic and, as a result, could usually coax a bit more performance out of most samples.

AMD reminded everyone that PowerTune was deterministic, but probably fewer than 0.01% of graphics card buyers (end-users of OEM machines included) understood the difference, and fewer still cared, so now PowerTune isn't deterministic anymore. In this sense, yes, NVIDIA opened a can of worms.
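
To make the distinction concrete, here's a rough sketch (Python; not AMD's or NVIDIA's actual algorithm, and the watt-per-MHz figures are invented): a deterministic scheme derives the clock from a fixed worst-case model so every chip lands on the same frequency, while a non-deterministic scheme uses what the individual chip actually measures, so a good sample ends up faster than a leaky one.

Code:
# Illustration only -- invented numbers, not either vendor's real algorithm.
POWER_LIMIT_W = 250

def deterministic_clock(activity, worst_case_w_per_mhz=0.26):
    # same answer on every chip: assumes the worst specimen of the silicon
    return min(1000, int(POWER_LIMIT_W / (worst_case_w_per_mhz * activity)))

def nondeterministic_clock(activity, measured_w_per_mhz):
    # answer depends on the particular chip you happened to get
    return min(1000, int(POWER_LIMIT_W / (measured_w_per_mhz * activity)))

print(deterministic_clock(1.0))             # identical on all cards (961)
print(nondeterministic_clock(1.0, 0.24))    # a good sample: capped at 1000
print(nondeterministic_clock(1.0, 0.28))    # a leaky sample: slower (892)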
 
I've always found it amusing that coaxing more performance out of something is considered to be a bad thing.

Yeah, it's not deterministic. And, yes, that sucks for the few who find meaning in one GPU being a few % faster than the other. For the remaining 99% of the consumers, it simply means that they got a bit more performance.
 
It's good in that it's basically free overclocking for those who wouldn't have had it otherwise. I'm still not totally convinced it's where they should have gone, though. It might be a few percent plus or minus now, but where does it end?

Another thing I wonder: if this carries forward to future nodes, as it seems logical to assume, how is it going to behave on much newer silicon? Is the turbo behaviour we're seeing now more a bonus of node maturity, or what?
 
New nodes are not going to change the fact that GPUs can't run at max clocks at all times without ever exceeding thermal limits.

IOW: the power wall will not go away.
 
I'm not sure this sort of technology "was always going to be on the natural progression curve of the tech".

Of course it was, and that's why Intel introduced turbo. Power consumption is now the primary limiting factor. In order to maximize performance in diverse workloads you need flexibility. This is a good thing.

AMD reminded everyone that PowerTune was deterministic

Pure marketing because they didn't have their own turbo solution yet. Now that we see their implementation it's obvious they don't value determinism that much.

In this sense, yes, NVIDIA opened a can of worms.

No, AMD did that with their "up to 1GHz" nonsense. They rolled out a very confusing and half-baked implementation of turbo. Instead they should've spent a few dollars on a better cooler and dropped this silly notion of a quiet mode that falls apart at the slightest nudge. Nobody seems to be complaining about a missing quiet mode on the 290.

It's the implementation that's bad, not the underlying concept.

They need to advertise minimum clock speeds like Intel/nVidia and build cards with higher tolerances. I'm sure they'll do a better job next time.
 
Huh? It's got a toggle button; in one setting it's generally running at 1GHz and in the other it's running at less. I wouldn't call that "totally and completely misleading" then.

You know what, I've gone back and read through the reviews and you're correct. I'd been so focussed on the stock "quiet" mode details that I'd completely missed that "uber" mode generally maintains 1000MHz at all times. That's actually a much better picture than I'd realised.

Additionally, it looks like the 290 maintains its 947MHz top clock most of the time at the new 47% fan speed.

So I guess this marketing isn't all that misleading then, although it does come at the expense of two very loud cards (ignoring the 290X's quiet mode, because that really does add a whole load of confusion). Clearly both cards desperately need better coolers. The third-party card reviews should be interesting.
 
I've always found it amusing that coaxing more performance out of something is considered to be a bad thing.

Yeah, it's not deterministic. And, yes, that sucks for the few who find meaning in one GPU being a few % faster than the other. For the remaining 99% of the consumers, it simply means that they got a bit more performance.

Just to be clear, I'm not saying it's a bad thing per se, just that it makes drawing conclusions from reviews harder, especially when competing products are very close, which is quite common with GPUs.

Of course it was, and that's why Intel introduced turbo. Power consumption is now the primary limiting factor. In order to maximize performance in diverse workloads you need flexibility. This is a good thing.



Pure marketing because they didn't have their own turbo solution yet. Now that we see their implementation it's obvious they don't value determinism that much.



No, AMD did that with their "up to 1GHz" nonsense. They rolled out a very confusing and half-baked implementation of turbo. Instead they should've spent a few dollars on a better cooler and dropped this silly notion of a quiet mode that falls apart at the slightest nudge. Nobody seems to be complaining about a missing quiet mode on the 290.

It's the implementation that's bad, not the underlying concept.

They need to advertise minimum clock speeds like Intel/nVidia and build cards with higher tolerances. I'm sure they'll do a better job next time.

The cooler sucks, but apart from that I fail to see how AMD's implementation of turbo is "half-baked". If anything it's far more sophisticated than NVIDIA's. It's just marketed differently, and perhaps poorly, but that's a different matter.
 
A small update added to TechSpot's R9 290 review:

Update: Based on your feedback, I took the IceQ X2 cooler off the HIS Radeon R9 280X and stuck it on our R9 290 sample. Cooling was dramatically improved. The FurMark stress test maxed out at 76 degrees while the card never exceeded 63 degrees in Crysis 3 and Battlefield 4. So it seems as expected the board partners will be able to solve the heat issues of the reference card.
http://www.techspot.com/review/736-amd-radeon-r9-290/page8.html
 
I've been wondering how much power usage goes down with temperature for the 290(X); I got some numbers for the 290 here: http://ht4u.net/reviews/2013/r9_290_im_griff_prolimatech_mk-26_black_im_test/index7.php
MK26 58C - 260W
MK26 silent 64C - 265W
Stock 94C - 289W
The readings are all quite high, so I'm not 100% sure the stock card isn't throttling at its 47% fan speed (however, it's the same wattage they get in the main review, where it says "3D load (games)" and not "(max)" like here). On the other hand, the MK26's fans are likely not driven by the card, which chops some watts (5?) off the stock card's number.

Still, it seems to be below 1W per degree; I would have thought it would be closer to double that.
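
Quick back-of-the-envelope from the numbers above (ignoring the ~5W fan caveat), which is where the "below 1W per degree" comes from:

Code:
# (temperature C, power W) pairs as quoted from the ht4u measurements above
readings = {"MK26": (58, 260), "MK26 silent": (64, 265), "stock": (94, 289)}
(t_lo, p_lo), (t_hi, p_hi) = readings["MK26"], readings["stock"]
print((p_hi - p_lo) / (t_hi - t_lo))   # (289-260)/(94-58) = ~0.8 W per degree C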

There's also this one (last graph), showing just a 4% overall difference between the 290 and 290X at the same (fixed) clock.
 
Pure marketing because they didn't have their own turbo solution yet. Now that we see their implementation it's obvious they don't value determinism that much.
Not correct. The previous version of PowerTune had many of the same parameters that this one does; we chose to peg it as deterministic, but as pointed out that wasn't particularly well received, so we've gone the other way. The PowerTune algorithm effectively has a dial in it where I can shift between the "level" of determinism and the level of non-determinism; for the R9 290 series that dial is turned to 100% non-deterministic. There may be a case to be made that what's implemented right now is too sophisticated and reacts too quickly / sensitively, and we'll see if we alter the position of the dial in future products, but the net net of that will be slightly lowered absolute performance.
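
Purely as an illustration of what such a dial could look like (Python; this is not AMD's code, and the names and numbers are made up), the clock target could be a blend between a fixed worst-case limit and the headroom the individual chip actually measures:

Code:
# Hypothetical "determinism dial" sketch -- invented names and numbers.
def clock_target(worst_case_mhz, measured_headroom_mhz, dial):
    """dial = 0.0 -> fully deterministic (same worst-case limit on every chip)
       dial = 1.0 -> fully non-deterministic (use whatever this chip can do)"""
    return worst_case_mhz + dial * (measured_headroom_mhz - worst_case_mhz)

print(clock_target(947, 1000, dial=0.0))   # deterministic: 947 MHz everywhere
print(clock_target(947, 1000, dial=1.0))   # dial at 100% non-deterministic: 1000 MHz on this sample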
 