AMD: Speculation, Rumors, and Discussion (Archive)

https://www.computerbase.de/2016-06/radeon-rx-480-test/12/

Was this posted here already? ComputerBase got better performance by undervolting their sample
Well that should be obvious. The card never hits the max clock (at least not their sample with this particular benchmark), always limited due to power target. Lower the voltage and it can increase the clocks a bit more - the power draw didn't budge at all.

FWIW, by the looks of it, if you restrict the power target so it effectively doesn't reach the higher P-states, efficiency should be quite a bit better (at P3, frequency is 1075 MHz and voltage 0.968 V, whereas at P7 frequency is 1266 MHz but 1.15 V - meaning a 19% voltage increase for an 18% frequency increase, which should translate into roughly 67% higher power usage for this 18% performance increase). I haven't seen tests playing with that, though...
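For what it's worth, that "roughly 67%" figure drops straight out of the usual dynamic-power approximation P ≈ C·f·V². A minimal sketch of the arithmetic, using the P-state numbers quoted above (the model ignores static/leakage power, so treat it as a rough estimate):

```python
# Sanity check of the P3 vs P7 efficiency claim, using the common
# approximation that dynamic power scales with f * V^2.
f_p3, v_p3 = 1075, 0.968   # P3: MHz, volts (figures quoted above)
f_p7, v_p7 = 1266, 1.150   # P7: MHz, volts

freq_gain = f_p7 / f_p3 - 1                            # ~ +17.8%
volt_gain = v_p7 / v_p3 - 1                            # ~ +18.8%
power_gain = (f_p7 * v_p7**2) / (f_p3 * v_p3**2) - 1   # ~ +66%

print(f"frequency: +{freq_gain:.1%}, voltage: +{volt_gain:.1%}, "
      f"estimated dynamic power: +{power_gain:.1%}")
```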
 
So far 7 sites have measured power draw above the 150 W maximum. I'm sure there will be more. This doesn't seem to be a one-off occurrence.
I'm talking about the duration and magnitude of the board's excursions above the limit, relative to what is considered compliant by the PCIe spec. Temporarily going above the limit is not sufficient to fail, at least from my faded recollection of when I last saw this discussed. What counts as "temporary" or as too significant is another item that had a bit of leeway, as is what counts as a representative workload for the purposes of the certification process.

I've only seen specific slot power draw numbers from Tom's Hardware, which look large enough, and possibly sustained enough, not to be excused by a claim of "power virus" or an otherwise non-representative scenario.
 
As I said before, AMD's marketing department needs a full wipe and a fresh start. Idk who was responsible for the "strategy", but that person really needs to be replaced along with most of his/her coworkers. As I said, this is exactly why hype is dangerous. You don't play with it unless you know for sure that you can win. AMD played, took a bet, lost completely, and it exploded in their face.

Sent from my HTC One via Tapatalk
 
hype ~ network effect ~ sales.
Not sure what else you expected them to do.

Curious how you would've shaped the discussion? --without the hyperbole.
 
There are ways to create and control hype. AMD basically let it snowball out of control, creating expectations even among the most advanced users (everyone in this thread, me included, thought it was a 110W card). It ended up being a 165W card; even in the best case it would have been 150W, which we discussed was not optimal for a single-6-pin card. Shame on AMD for trying to fool users on that aspect. It's clearly a card that needs an 8-pin, and they chose to fool us on that.

Sent from my HTC One via Tapatalk
 
So AMD should manage your expectations?
AFAICS:
  • 150W TBP - true
  • Equivalent performance to ~$500 cards - true
  • VR Ready - true
  • Multimedia functionality - true
  • Up to 2.8x power efficiency - true

I don't remember anything from AMD claiming that this is a 110W card, and even if you thought so, that is a 40W difference from your expectation. I am failing to see how this is bad.

Can someone explain why this card is "disappointing"? --without the hyperbole.
 
As I said before, AMD's marketing department needs a full wipe and a fresh start. Idk who was responsible for the "strategy", but that person really needs to be replaced along with most of his/her coworkers. As I said, this is exactly why hype is dangerous. You don't play with it unless you know for sure that you can win. AMD played, took a bet, lost completely, and it exploded in their face.
Really need to wait and see how it sells before judging their hype and marketing. That is ultimately the goal of hype.

As for power, there will always be transients that exceed the normal ratings. Not sure that applies in this case, but so long as they don't exceed it for significant periods it should be fine. Tried finding the spec but it's behind a paywall atm. It's likely a board design issue, considering all the FETs are on the opposite side from the power connector. So AIBs will likely fix that, and AMD could revise the board for a newer revision.

I don't suppose any of the reviewers around here, or anyone with a card for that matter, can see if temperature has any bearing on performance? The idea being that the card needs to be hot (80C+) to achieve low (~0.6V) voltages. Not sure how well WattMan would work for that unless you fixed the fan speed and power while dialing down voltage. Those voltages may be too low to drive the memory though.
 
[Image: b3d-poly.png - B3D polygon throughput results]

Still really behind on polygon throughput.
This is an extreme case with 100% culling. In the real world you draw some triangles so the numbers converge some.
 
So AMD should manage your expectations?
AFAICS:
  • 150W TBP - true
  • Equivalent performance to ~$500 cards - true
  • VR Ready - true
  • Multimedia functionality - true
  • Up to 2.8x power efficiency - true

I don't remember anything from AMD claiming that this is a 110W card, and even if you thought so, that is a 40W difference from your expectation. I am failing to see how this is bad.

Can someone explain why this card is "disappointing"? --without the hyperbole.
7 review cards go beyond that TDP while gaming, and you can argue "7 out of hundreds is not significant", but remember: GPUs chosen for reviews, and even more so for launch reviews, are the best of the best they could get, and if 7 of them are going beyond the limit... then that's a worrying trend. Even if it isn't, we already discussed that you can't sell a GPU without a safe margin in power consumption. In my opinion AMD has been hugely irresponsible in the way they are marketing these cards: these are mid-range cards that are going to be placed into mid- to low-tier mobos, and we know those are not the same quality as high-end ones. I wouldn't be surprised if a low-tier mobo could only deliver 90W through the PCIe slot before starting to take damage.

Now here is the bigger irresponsibility from AMD: they are marketing this card for multi-GPU configs. Why is this important? Well... the slots all draw from the same motherboard power delivery. So let's be clear: even if each 480 stays at 150W, that is 75W out of each PCIe slot, so a CrossFire pair is sucking 150W through the motherboard. We discussed that cards typically ask for less than 50W from the PCIe slot; well, this is three times higher. Now if we take into account that the cards may actually be asking for 80+W out of the PCIe slot, then things start to get a little scary.

Now, I don't know if AMD could change the way the card sources its power via software (I doubt it), taking more from the 6-pin instead of the PCIe slot, but even then they would be asking too much from mid- to low-tier power supplies... I can't believe AMD actually did something this silly just in order to look prettier in the leaked pictures... I really can't believe it...

None of this would have happened if the reference card had an 8-pin power connector instead of the 6-pin, or if they had reduced the card's frequency. Either way, AMD decided to sacrifice users' hardware in order to make its product look better in the leaks.

Now, if you ask me whether I would buy a 480, the answer is probably yes. I am very pleased with the level of performance it delivers for its price (or I could get a used 970 or 980 at a good price). But I have a mid-tier power supply and a high-tier mobo, and I'm not planning on doing multi-GPU at the moment, so I'm not worried about my hardware. But if you think about the market this card is aimed at and the way it is being marketed, then there is the possibility that someone with a cheap motherboard and a cheap power supply tries to do CF with low-ASIC 480s, tries to overclock them, and then it could go wrong. I'm not saying it will explode or anything like that, but do you agree that it is OK for a company to take the risk of killing your hardware just to look better in a picture? I do not.
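As a rough illustration of the slot-power numbers being thrown around here: the PCIe CEM budget for a 75W slot is commonly broken down as 5.5A on +12V (66W) plus roughly 10W on +3.3V. A minimal sketch, where the measured figure is an assumed round number in the ballpark of what reviews like Tom's Hardware reported, not an exact quote:

```python
# Per-slot draw vs the PCIe CEM budget, plus the CrossFire total.
SLOT_12V_W     = 66.0    # 12 V portion of the 75 W slot budget (5.5 A @ 12 V)
measured_12v_w = 82.0    # assumed sustained 12 V slot draw under load

excess = measured_12v_w - SLOT_12V_W
print(f"12 V slot draw: {measured_12v_w:.0f} W "
      f"({excess:+.0f} W, {measured_12v_w / SLOT_12V_W - 1:+.0%} vs budget)")

# Two cards in CrossFire each load their own slot, but the combined
# current still comes in through the board's shared ATX input:
cards = 2
print(f"total slot-side draw for {cards} cards: "
      f"{cards * measured_12v_w:.0f} W through the motherboard")
```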
 
If customers do feel like expectations were poorly managed, then AMD is certainly partially responsible for that. They were the ones comparing it to the 4870.

Ouch, I hadn't seen that one before. I know that personally I had entertained that thought after reading some of the Videocardz and WCCFTech stories, but generally refrained from actually believing it. Pascal, as the comparative architecture to GT2xx, is just too good for AMD to have come up with something in Polaris that offers an equivalent comparison.

Regards,
SB
 
I can't help but wonder if people are getting worked up over nothing about how much power is drawn through the PCIE slot. Few people commenting on this have any real knowledge so it's fear mongering until someone runs tests to prove durability. Note that I count myself as someone without real knowledge in this area, but my first response isn't to assume product designers don't know what they're doing.
 
The updated/more detailed 480 board review.

Looks like the PCB is a very, very good quality one, no corners cut there.

Also, standards/specifications are made for a reason.
 
The RX 480 has the new Quick Response Queue, allowing you to partition off some of the GCN compute units for your exclusive use, leaving the rest available for other tasks like graphics or other compute. It would be extremely interesting to run several standard FPS benchmarks and use this new feature to show RX 480 performance with the full 36 CUs enabled, with 30, with 18, with 2. I'm not even sure what conclusions we could draw from such tests, but it'd be fascinating to see any differences from purely linear scaling. It might also let us anticipate Polaris 11 a bit, though it'd be an optimistically simulated Polaris 11, with more memory bandwidth, cache, cooling, and power delivery than the actual smaller die will have.

The AMD blog post mentions that new drivers will have this new feature for older GCN 2.0+ GPUs as well.

This would require some AMD-savvy graphics programmer to write a small utility to make the desired GCN Quick Response Queue reservation and just idle it until the utility was exited.
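If someone does run that experiment, the post-processing is trivial. A hypothetical sketch of it (the FPS values are made-up placeholders, not measurements, and the CU-reservation utility is assumed to exist):

```python
# Compare measured FPS at each CU count against ideal linear scaling
# from the full-chip result.
full_cus = 36
results = {36: 60.0, 30: 54.0, 18: 38.0, 2: 6.0}   # CUs -> avg FPS (fake data)

baseline = results[full_cus]
for cus, fps in sorted(results.items(), reverse=True):
    linear = baseline * cus / full_cus   # what perfectly linear scaling predicts
    print(f"{cus:>2} CUs: {fps:5.1f} fps, linear model {linear:5.1f} fps, "
          f"ratio {fps / linear:.2f}")
```

Ratios above 1.0 at lower CU counts would suggest the full chip is bottlenecked elsewhere (bandwidth, fixed-function), which is exactly what would make the Polaris 11 extrapolation optimistic.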
 
I can't help but wonder if people are getting worked up over nothing about how much power is drawn through the PCIE slot. Few people commenting on this have any real knowledge so it's fear mongering until someone runs tests to prove durability.
Well, some are apparently out of spec on two fronts (both board and power supply). I don't really see too much of a real-world issue on the power supply side given how short the cables typically are (the gauge used for that distance & rating is overkill). The connector itself might be more of an issue, but I would guess it is pretty overkill as well. EDIT: Yeah, they could easily bump the rating for the 6-pin to 90W tomorrow and nothing bad would happen... Honestly, it could go even higher than that and still be "safe", but standards are standards and exist for a reason.
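A back-of-envelope check of that 90W claim, assuming two live +12V contacts on the 6-pin and a conservative ~6A per-contact rating for Mini-Fit Jr style terminals with 18AWG wire (illustrative assumptions, not spec values; real ratings vary by terminal series and gauge):

```python
# Rough physical ceiling of a 6-pin PCIe power connector vs its 75 W rating.
live_pins = 2          # assumed number of live +12 V contacts
amps_per_pin = 6.0     # assumed conservative per-contact rating (amps)
volts = 12.0

ceiling_w = live_pins * amps_per_pin * volts   # 144 W
print(f"rough physical ceiling: {ceiling_w:.0f} W vs the 75 W spec rating")
```

Even with these deliberately conservative numbers there is plenty of headroom over 90W.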

Cheap motherboards are what I would worry about... On the board side, for "typical" usage, it probably won't be an issue either. But when people start overclocking on cheap motherboards, yeah, I can see shortened lifespans and possibly even failures happening. Overclocked + Xfire would also be pretty scary to me. Thankfully, I am interested in neither.


Note that I count myself as someone without real knowledge in this area, but my first response isn't to assume product designers don't know what they're doing.
I don't know the internal situation, but operating out of spec is a pretty big #$%# up. Maybe the card was originally designed/intended to hit lower frequencies. Because I mean, damn, if you are that close (and sometimes over), just use an 8-pin...
 
Continued rx 480 review commentary:

Didn't previously notice that guru3d puts the GTX 1080 at 30% faster than the RX 480 at Warhammer 1080p ultra, and the GTX 1070 at 21% faster. People are saying 10% overclocks are providing around a ~10% performance boost; I haven't checked that claim. But the rumor is AIBs might reach a 26% OC, so if performance increases proportionally, that would put it within 4% of a GTX 1080 at 1080p in some DX12 games, and above a 1070 in some DX12 games at 1080p.

Probably something's limiting the gtx 1080 at 1080p, but considering guru3d is using
Mainboard: MSI X99A GODLIKE Gaming
Processor: Core i7 5960X (Haswell-E) @ 4.4 GHz on all eight cores
I don't see any reasonable gamer, let alone a budget gamer, not being limited in a similar manner for the foreseeable future.
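That proportional-scaling claim is easy to sanity-check against guru3d's numbers. A quick sketch assuming performance scales 1:1 with clock (optimistic, since memory bandwidth won't scale along with the core):

```python
# Project an overclocked RX 480 against guru3d's 1080p Warhammer leads.
gtx1080 = 1.30   # 30% faster than stock RX 480 (guru3d)
gtx1070 = 1.21   # 21% faster than stock RX 480 (guru3d)
oc = 1.26        # rumored AIB overclock headroom

rx480_oc = 1.00 * oc
print(f"OC'd RX 480 vs GTX 1080: {rx480_oc / gtx1080 - 1:+.1%}")   # ~ -3%
print(f"OC'd RX 480 vs GTX 1070: {rx480_oc / gtx1070 - 1:+.1%}")   # ~ +4%
```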
 
The comparison is interesting but won't be a real-world scenario. No one will play at 1080p on a GTX 1080, and no one would (not saying they won't) play higher than 1080p on a 480.

The key to the success of the custom 480s will be the price. A 4GB 480 at 1400MHz for 240 dollars would be a beast; at 300 it would be a waste.
 
I don't think it is quite that bad.... die size seems reasonable to me. The geometry performance/unit is definitely improved (just wish they had recognized that as an opportunity to lead not just catch up). The performance/$ is very good. I don't even really care about the power consumption (though I can see this being an issue for mobile applications), just the heat and noise that goes with it. I was planning on buying a reference one, but now I think I'll wait for the custom designs and the GTX1060. I'd rather have 8GB of memory than 6GB, but if the 1060 is both cooler and quieter that will be pretty hard to resist.

The 150W+ draw on a 150W rating is kinda odd as well. I mean, why even fly that close to the sun...

Feh, in terms of die space and TDP AMD was close to Nvidia back with the Fury X/980 Ti. Now, silicon for silicon and watt for watt, the RX 480 is significantly behind. But that doesn't seem to matter when all you can see are "out of stock" signs all over the place https://www.nowinstock.net/computers/videocards/amd/rx480/
 
I can't help but wonder if people are getting worked up over nothing about how much power is drawn through the PCIE slot. Few people commenting on this have any real knowledge so it's fear mongering until someone runs tests to prove durability. Note that I count myself as someone without real knowledge in this area, but my first response isn't to assume product designers don't know what they're doing.
I don't think it's a big deal in practice (as in: it probably won't kill any motherboards), and I don't think something like a PCIe committee has a lot of teeth to do something as drastic as revoke a sticker.

But it's easy to see how something might have slipped through the cracks: independent certification labs are usually paper pushers that go through a checklist. They're not going to look for errors, so it's enough that someone at AMD didn't pay attention during measurements.
 
Yeah, I didn't realize the transistor count between Polaris 10 and GP104 was as close as it is. I know, potato, potahto, but the GTX 1080 achieving the performance it does with +26% transistors is impressive...
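For reference, that +26% lines up with the commonly reported transistor counts for the two dies (treat the figures as the usual published numbers, not official measurements):

```python
# GP104 vs Polaris 10 transistor counts (commonly reported, in billions).
polaris10_bt = 5.7
gp104_bt = 7.2
print(f"GP104 vs Polaris 10: {gp104_bt / polaris10_bt - 1:+.0%} transistors")
```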
 