AMD: Speculation, Rumors, and Discussion (Archive)

AMD also seems to be focusing on one card per eye. So two 480s at $400, or the 8 GB versions at $450/$460, would be a very nice VR feature if games took advantage of it.
 
None. Let's hope we get something more exciting when the NDA lifts.


Yes.
Meh, then. It seems both Nvidia and AMD only got the improvements inherent to the new, smaller FinFET process. The thing is, AMD needed a real architectural improvement.
As someone has said, this $200 card is the logical perf/dollar improvement we have always had with new nodes, nothing extraordinary.
The thing is, we have been stuck on 28nm for so long that the 980 became a mythical, expensive card that seemed impossible to buy at a decent price (and Maxwell was in fact the biggest purely architectural jump in perf/watt ever, along with R300 and G80). So being able to buy an RX 480 now with the performance of a 970 seems incredible.
 
Two things:
First, what we don't know yet is the actual power consumption of the RX 480. A single 6-pin connector only tells us it's more than 75 watts (and no more than 150). It could still be VERY low.
Second, AotS is a showcase benchmark for AMD, yes. And here we will probably see the smallest performance increase (uplift? ;)), because honestly we expect improvements to the weak points of the architecture, and AotS mostly shows off the strong ones. So performance slightly above Hawaii XT could be the lower end of what to expect.
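(For context, a minimal sketch of the ceiling that connector layout implies, assuming only the per-source limits from the PCIe spec: 75 W from the slot and 75 W from one 6-pin; the rest is just arithmetic.)

    # Per the PCIe spec: 75 W from the x16 slot, 75 W from each 6-pin connector.
    slot_w = 75
    six_pin_w = 75

    ceiling_w = slot_w + six_pin_w           # hard ceiling for a one-6-pin card
    print(f"spec ceiling: {ceiling_w} W")    # 150 W; actual draw can sit well below this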
 
AMD said 150W.
http://www.dsogaming.com/wp-content/uploads/2016/06/Radeon-RX480-feature.jpg
 

Aren't the clocks attributable to the FinFET node?


Some of the clocks, but very little of it. Hexus has an article on this; they asked the lead engineer on Pascal about the clocks, and they had to make extensive changes to the architecture to get them.

http://hexus.net/tech/reviews/graph...gtx-1080-founders-edition-16nm-pascal/?page=2
Jonah Alben, who oversaw Pascal, said to us that a huge amount of work had been done to minimise the number of 'violating paths' that stand in the way of additional frequency. This is critical-path analysis by another name, where engineers pore over the architecture to find and eliminate the worst timing paths that actually limit chip speed. If successful, as appears to be the case here, the frequency potential is shifted to the right, or higher. Alben reckoned that Nvidia managed a good 'few hundred megahertz' by going down this path, if you excuse the pun, so Pascal is Maxwell refined to within an inch of its life.
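(To make the "violating paths" point concrete, here is a toy sketch with made-up path delays, not Nvidia data: the achievable clock is set by the slowest timing path, so every worst path you fix raises the frequency ceiling.)

    # Toy critical-path example with hypothetical delays (nanoseconds).
    path_delays_ns = [0.80, 0.85, 0.92, 1.00]

    f_before_ghz = 1.0 / max(path_delays_ns)                            # limited by the 1.00 ns path
    f_after_ghz = 1.0 / max(d for d in path_delays_ns if d < 1.00)      # same chip with that path fixed

    print(f"before: {f_before_ghz:.2f} GHz, after: {f_after_ghz:.2f} GHz")
    # Fixing one violating path lifts f_max from 1.00 GHz to ~1.09 GHz here;
    # repeating that across many paths is how "a few hundred megahertz" adds up.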
 
The card can physically draw at most 150W from its power sources (75W from the slot plus 75W from the 6-pin). That means that, even in the worst case, it has to stay a safe margin below that.
 
Just "Power: 150W". That isn't very specific. Is it the PowerTune limit? TDP? Total board power? Typical gaming power? Or just the PCIe specification (75 W via the slot + 75 W via the 6-pin)? AMD used a very vague formulation; maybe they have a reason for that. We'll see in less than four weeks.

Usually there's some breathing room. If it really does push board power right up to the expansion-slot spec, that would be worrisome.
The PowerTune target is historically lower, and generally has to be. Power delivery to the GPU isn't perfectly efficient, and other consumers on the board take their share of the overall board power. The PCIe spec isn't absolute about momentarily crossing the limit, but having a measurable chunk of consumption sitting above it would stretch the spec to meaninglessness.
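(A rough illustration of that point, using placeholder numbers rather than anything from AMD: the PowerTune/ASIC budget is roughly what remains of the board power limit once memory, fan and regulator losses take their share.)

    # Hypothetical breakdown of why the GPU power target sits below total board power.
    board_limit_w = 150.0    # 75 W slot + 75 W 6-pin
    memory_w = 20.0          # assumed GDDR5 draw
    fan_misc_w = 5.0         # assumed fan / misc draw
    vrm_efficiency = 0.88    # assumed regulator efficiency

    asic_budget_w = (board_limit_w - memory_w - fan_misc_w) * vrm_efficiency
    print(f"ASIC budget: ~{asic_budget_w:.0f} W")   # ~110 W of the 150 W board limit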
 
When was the last time you saw a graphics card draw exactly the amount of power provided by the PCIe slot plus its power connectors? (Excluding one dual-GPU abomination that goes comfortably over what it should draw.)
The GDDR5 variant of the HD 7750: rated 75W, PCIe slot only, same for the R7 250. The R7 265 / R9 270 (150W with one 6-pin). The GT 640 (GK107 GDDR5) was slot-only at 75W.
 