AMD: Speculation, Rumors, and Discussion (Archive)

Status
Not open for further replies.
Not really if he was praising the 1/2 DP rate.

Indeed, Hawaii currently has the highest DP performance of any single GPU at 2.5 TFLOP/s.
If Fiji's CUs are similarly DP-capable, a pro version could reach 4 TFLOP/s DP
(matching Pascal, which is 16 nm versus Fiji's 28 nm).
 
a) $169
b) 169mm^2
c) 169TFLOPs
d) 169 man years R&D etc
e) 16.9nm
f) 16:9
g) 169 sales projection
h) 16 September 2016

Or it's just the Gemini graphics card with two Fiji GPUs at 1030MHz, which would result in 16.9 TFLOP/s.
And 16.9 TFLOPs is over 2.5x the theoretical compute power of nVidia's current flagship, the Titan X, whose price range is probably what the new card will target.
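The 16.9 TFLOP/s figure falls out of the usual peak-FLOPs arithmetic (shader count × 2 ops/clock for FMA × clock speed), taking Fiji's 4096 stream processors and the 1030 MHz clock quoted above. A quick sketch:

```python
def peak_tflops(shaders: int, clock_ghz: float, ops_per_clock: int = 2) -> float:
    """Theoretical peak TFLOP/s: shaders * ops per clock (FMA = 2) * clock in GHz."""
    return shaders * ops_per_clock * clock_ghz / 1000.0

# Two Fiji GPUs (4096 shaders each) at 1030 MHz:
dual_fiji = 2 * peak_tflops(4096, 1.03)
print(f"{dual_fiji:.1f} TFLOP/s")  # → 16.9 TFLOP/s
```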


Fun fact: Polaris Gemini is the name of a Snowmobile.


EDIT: Which means this shouldn't be in the Arctic Islands thread.
 
So similar to enthusiast GPU comparisons involving AMD 295x2 and Nvidia single GPU's.

In this case we are not talking about GPU performance, but about the peak DP rate of their respective architectures... If Nvidia needs two GPUs (or two dies on one PCB) to match the AMD S9170's DP performance, that should say it all... (2.62 TFLOPs DP and 32 GB of memory for the S9170.)
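The 2.62 TFLOPs DP number checks out with the same peak-FLOPs arithmetic, assuming Hawaii's 2816 stream processors, the S9170's 930 MHz clock, and the 1/2-rate FP64 mentioned earlier (a back-of-the-envelope sketch, not a spec sheet):

```python
shaders, clock_ghz = 2816, 0.930          # FirePro S9170 (Hawaii)
fp32 = shaders * 2 * clock_ghz / 1000.0   # FMA = 2 ops per clock
fp64 = fp32 / 2                           # Hawaii pro parts run FP64 at 1/2 the FP32 rate
print(f"FP64: {fp64:.2f} TFLOP/s")        # → FP64: 2.62 TFLOP/s
```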

Anyway, the K80 is an impressive piece of hardware.
 
Polaris is above the North Pole, which is where the Arctic happens to be.

Well then, to me that means Fiji will be rebranded into 2016's GPU family. Maybe with HBM2 if the chip supports it.
 
Err, they'd better not. Competition-wise it won't be good; the power consumption of Fiji-based chips won't fit well in the mid-range segment...

I think they will have two new chips: one to cover the high end down to the upper mid-range, and a smaller chip to cover the mid-range down to the upper low end. A third chip for the low end will come out in 2017.
 
Considering performance/watt will go up a lot just from the switch to 3D transistors (FinFETs), one can only hope there's more to be had through actual bona fide architectural improvements, or AMD might be in a tight spot versus Nvidia come next gen.

Personally I don't mind that much if a fast GPU belches heat as long as it's fast, but a cooler running GPU will of course have a thermal headroom advantage that allows it to bring more performance to the table.

We'll see what happens. The future isn't here yet - sadly... :(
 

AMD already puts out less heat than Nvidia cards do. AMD is currently voltage limited, and besides, the Fury X isn't a very well-balanced design. It might beat the Titan X if it were, but it isn't. As for the process change, there are distinct tradeoffs between clock speed, power efficiency, and even transistor density. Only AMD knows what balance it's striking between all of that, and thus how much the architecture plays into that "2 times perf/watt" claim.
 
AMD already puts out less heat than Nvidia cards do.
Are you sure? From TechPowerUp's figures, both the R9 390X and the R9 Fury X pull more juice than the 980 and 980 Ti, while being slower in gaming than the NV high-end part*. But maybe you have facts that dispute this? :)

*At 1440p. At 2160p, the R9 Fury X has a very slight advantage over the 980 Ti, but *cough* who games at 4K anyway with today's cards?
 
Without context, "heat" is really meaningless. What matters more is perf/W.

What does that even mean?

He's the one who wanted to know about heat, so, whatever. Per unit of performance, AMD cards put out less heat than Nvidia cards. I mean, that's probably not important for most people unless you're running a huge datacenter with expensive cooling. But then performance per watt isn't that important unless you're doing the same thing, and people go nuts over it for no reason.

You also know perfectly well what voltage limited means, or power-draw limited if you prefer. Power draw doesn't scale linearly with performance: a Fury X draws about 300 watts, roughly the highest a GPU can reasonably go, while a Titan X draws 250 watts.

And yes, I'm absolutely certain AMD cards get less hot than Nvidia ones, unless AMD has coolers that magically work better under full load while being quieter than Nvidia's at the same time. An air-cooled Fury, which is solidly faster than a 980 (non-Ti), is still several degrees cooler as well: http://www.anandtech.com/show/9421/the-amd-radeon-r9-fury-review-feat-sapphire-asus/17
 
A Fury X draws about 300 watts, roughly the highest a GPU can reasonably go, while a Titan X draws 250 watts.

And yes, I'm absolutely certain AMD cards get less hot than Nvidia ones, unless AMD has coolers that magically work better under full load while being quieter than Nvidia's at the same time. An air-cooled Fury, which is solidly faster than a 980 (non-Ti), is still several degrees cooler as well: http://www.anandtech.com/show/9421/the-amd-radeon-r9-fury-review-feat-sapphire-asus/17

Which one do you think heats up the room more: the card that draws 300 W or the one that draws 250 W? And are you really comparing a custom third-party cooler to the Nvidia reference blower cooler and claiming "AMD cards put out less heat" on that basis!?
 