AMD: Volcanic Islands R1100/1200 (8***/9*** series) Speculation/ Rumour Thread

Wonder when we get to see what Mantle does to the performance gap in BF4.

Ahem
http://forum.beyond3d.com/showpost.php?p=1797414&postcount=427

 
Mantle is being "announced" at the end of November, with benchies in BF4 in early December. Or so was the schedule a few weeks ago...
 
On matched (or identically timed) panels, the clocks should be able to come down.

But why can Nvidia do it anyway, seemingly without complaints about flickering etc.?

Mantle is being "announced" at the end of November, with benchies in BF4 in early December. Or so was the schedule a few weeks ago...

With a planned release in December, they should be able to show some performance figures at the summit in mid-November. At least some rough, developer-style numbers.
 
Personally, I'd like to congratulate AMD on a job well done, both in terms of efficiency and in terms of market positioning. The top end of discrete gfx cards has definitely taken a step towards saner price levels.
I hope they get rewarded by reasonable sales volumes even late into the 28nm life cycle.
 
The card is great value.

The reference cooler is underwhelming and the power consumption when you overclock is awful.

A custom cooler will solve the noise/temp issues, but it appears that 28nm has been stretched to its limits.
 
From what I saw in the reviews, I think the overclocked, custom-cooled, don't-care-how-much-it-consumes 3rd-party cards will really show what the chip can do and put it head and shoulders above the Titan.
I do wonder if increasing the MSRP by $15 for a more expensive cooling solution wouldn't have been preferable for a flagship product, though.

Regardless, congratulations to AMD for this release. It's certainly a success.

It should kill Nvidia's ridiculous margins on the Titan and GTX 780.
 
I agree.

In Far Cry 3 there is a +10% gain with the fan cap raised to 100%.

http://www.hardocp.com/article/2013/10/23/amd_radeon_r9_290x_video_card_review/13#.UmjkQRAVR_R

AMD should definitely start delivering better reference coolers, like Nvidia does.


Did the fan actually run at 100 percent?

HardOCP said:
100% Fan vs. Uber vs. Quiet

Now we are going to manually enable a maximum fan cap of 100%. Remember, the fan will not run at 100%, all this is doing is "unlocking" the fan to be able to run up to 100% if the controller so chooses to while gaming.
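The distinction HardOCP makes can be sketched as a toy controller model. This is a hypothetical illustration, not AMD's actual fan algorithm: raising the *cap* to 100% doesn't force 100% duty, it only raises the ceiling the temperature-driven controller is allowed to reach. All the thresholds below are invented.

```python
def fan_duty(temp_c: float, cap: float = 0.55, target_c: float = 95.0) -> float:
    """Return fan duty (0..1): controller demand clamped by the user-set cap."""
    # Hypothetical curve: demand rises linearly from 20% at 60 C
    # to 100% at the target temperature.
    demand = 0.2 + (temp_c - 60.0) / (target_c - 60.0) * 0.8
    demand = max(0.2, min(1.0, demand))
    # The cap only clamps the ceiling; it never forces the fan up.
    return min(demand, cap)

# With the stock 55% cap, the fan pins at 55% even at the 95 C target.
print(fan_duty(95.0))            # capped at 0.55
# "Unlocking" to 100% only matters when the controller actually demands more.
print(fan_duty(95.0, cap=1.0))   # controller can now reach 1.0
print(fan_duty(70.0, cap=1.0))   # at 70 C it still runs well below 100%
```

So in a game that never pushes the card to its temperature target, the unlocked fan never reaches 100% either, exactly as the quote describes.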
 
Very good pricing on the card, and that means the regular 290 will also be nicely priced. The performance is also very good, especially with good airflow; 3rd-party coolers and water cooling should really make this card shine.

AMD has also mostly fixed the Eyefinity Crossfire issues with their new driver. Hopefully this fix will also trickle down to their other cards with the old Crossfire connector. I just got myself 2 7990s for a good price, but now I'm thinking that 3 regular 290s might have been a better choice. If the frame pacing fix doesn't come to my cards, I'll just shoot myself with a quantum torpedo.
 
Great card aside from the reference cooler. I'm almost tempted to pick one up, but I think I'll wait a year for the next generation on 20nm. That said, I expect some nice open-air and even triple-slot coolers to really make this an amazing GPU. I kinda wish AMD had spent a little more on the cooler in the first place for the sake of CF users; blowers are important for multi-GPU, and I haven't seen a good aftermarket one offered in years.
 
Protip: overclockers don't give a shit about power consumption*



*unless it exceeds their power supply's ability to deliver stable current/voltage.

They do when it can destroy their cards that much quicker.




To get a 65 MHz OC over stock, Wizard's voltage increase to just a touch over 1.25 V caused power consumption to go up 250 W, taking his system from 400 W to 650 W in a test that only stresses the GPU. Just imagine the power going through the card at 1.4 GHz plus with even more volts.

In comparison, a GTX 780 Lightning at 1.3 V (overvolted by 0.15 V), running a 20% overclock over stock Lightning speed (30% over stock GTX 780 speed), consumed 400 W in the same test.

I don't think this architecture is meant to be overvolted. It might even be dangerous with water cooling or other non-stock cooling.

E.g., 95 °C is a safety threshold that tells the card to lower clocks and increase fan speed. What happens if a person carelessly increases clocks and voltage and the 95 °C mechanism never kicks in because of water cooling? Wizard didn't go beyond 1.26 V, but we see some people putting 1.35-1.4 V on 7970s. If someone did that on a 290X, I think it could draw power beyond what even water cooling was meant for and push the card's electrical components to their limits. A lot of people on LN2 mod their cards to increase their electrical capacity, but this card seems to drink power like it's under LN2 even on air.

Your card might be consuming 700-800 W, but nothing would make it throttle down, since that 95 °C would never be reached. Although power consumption might drop a bit at cooler temperatures, it doesn't change the fact that nothing scales power with voltage as fast as the 290X.
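The scaling in the posts above can be put in rough numbers. Dynamic power scales roughly with frequency times voltage squared; here is a sketch with made-up stock figures (the 1.18 V / 1000 MHz / 250 W baseline is illustrative, not a measured 290X value):

```python
def scaled_power(p0_w: float, v0: float, f0_mhz: float,
                 v1: float, f1_mhz: float) -> float:
    """Estimate dynamic power after a clock/voltage change: P ~ f * V^2.

    Ignores leakage, which rises sharply with voltage and temperature,
    so real cards scale even worse than this first-order estimate.
    """
    return p0_w * (f1_mhz / f0_mhz) * (v1 / v0) ** 2

# Illustrative only: assume 250 W of GPU power at 1.18 V / 1000 MHz stock,
# then bump to 1.26 V / 1065 MHz (roughly the small OC described above).
stock_w = 250.0
oc_w = scaled_power(stock_w, 1.18, 1000.0, 1.26, 1065.0)
print(f"{oc_w:.0f} W")
```

The f·V² term alone predicts only about a 50 W increase here, so the much larger jumps reported for overvolted cards point at leakage and temperature effects on top of the first-order scaling.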
 
But now add a decent cooler, mature drivers, Mantle, and an OC version with more TDP headroom, and you get a card that could crush a Titan by 30-50% in BF4! That is fantastic considering the price.
 
$549? Nice move AMD. They should've invested in better cooling though. The high temp target is interesting - wonder if that was planned from the start or done out of necessity.

I know some guys were concerned about the variability in performance when nVidia introduced boost clocks. This should be an even wilder ride depending on your case and ambient temps. The upside is that temperature is something you can control to an extent so you can pick the noise/perf/temp levels that you want.
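The case/ambient dependence described above can be sketched as a toy boost controller. All thresholds and ramp rates here are invented for illustration, not AMD's actual PowerTune parameters:

```python
def boost_clock(temp_c: float, base_mhz: float = 850.0, boost_mhz: float = 150.0,
                target_c: float = 95.0, mhz_per_c: float = 10.0) -> float:
    """Toy boost: full boost with thermal headroom, ramping down to base at target."""
    if temp_c >= target_c:
        return base_mhz                 # at the limit, fall back to base clock
    headroom = target_c - temp_c        # degrees of thermal headroom remaining
    # Clock rises with headroom but is capped at base + full boost.
    return min(base_mhz + boost_mhz, base_mhz + headroom * mhz_per_c)

# Cooler case/ambient -> more headroom -> higher sustained clocks.
print(boost_clock(75.0))   # 1000.0: full boost, plenty of headroom
print(boost_clock(90.0))   # 900.0: only 5 C of headroom left
print(boost_clock(95.0))   # 850.0: at the limit, base clock only
```

This is why two identical cards in different cases can sustain different clocks, and why controlling temperature (fan profile, airflow) effectively lets the user pick their own point on the noise/perf/temp curve.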
 
I think I got my answer as to why they didn't use their fast MCs: the additional power consumption would make it useless. Pity about the noise/power, it's otherwise a very nice chip for a great price.

At the same time, it's remarkable to see how Nvidia is now crushing AMD in power efficiency at the high end. The perf/mm² free-for-alls of the past are gone. AMD had better have an answer to that for the next generation.
 