The AMD Execution Thread [2018]

AMD Vega 10, Vega 11, Vega 12 and Vega 20 Rumors and Discussion

So it looks like 2018 will be devoid of any new GPUs from AMD, leaving NVIDIA to release new high-end SKUs and raise the bar even higher. Another year, another gap from AMD, as even the 7nm Vega is an AI-only chip that will be sampled to a select few at the end of the year.

Having no new dGPUs to compete with Nvidia in 2018 will hurt AMD's bottom line and result in a loss of discrete market share.
 
Having no new dGPUs to compete with Nvidia in 2018 will hurt AMD's bottom line and result in a loss of discrete market share.
Discrete, yes. How much will Raven Ridge, Vega Mobile and the Intel/Vega chip bring in for AMD, though? Lower-powered devices are certainly a much larger market than performance/enthusiast dGPUs.
 
There's near zero reason for AMD to launch new desktop discrete GPUs this year.

Why would they, when shelves everywhere are completely devoid of mid-to-high-end AMD GPUs? The ridiculous cryptocurrency spike that happened in December made sure no one but miners will get a desktop AMD card for the next 6 months to a year.

AMD is selling all the desktop cards they're making regardless. There's even less pressure for them to update Polaris and Vega than there is for Nvidia to update the Pascal line.
Things are crazy right now. Search for any AMD GPU on Amazon and here's what they recommend:

[screenshot: Amazon search recommendations]



AMD is probably also being very cautious about ramping up GPU production because they wouldn't gain any gaming market share (the cards would all end up with the miners), and the moment there's a cryptomining crash we'll get a flood of used cards on the market, plus lots of brand-new GPUs no one wants because of all those used cards. The more cards they make right now, the bigger their sales drought will be in a year or so.

So why would AMD spend their resources updating e.g. Polaris with GDDR6, or on a desktop "Vega M"? They already artificially pushed up the price of Polaris 10 when they re-launched it as the RX 570/580, so that's as far as they can go.



What AMD should do, and rightfully did, is focus on the mobile market, where purchasing a whole laptop won't ever be as profitable to a miner as getting a mining motherboard with a Celeron and connecting 8 discrete cards to it.
That's why we have a "Vega Mobile" and not a "Vega 32" or "Vega 24", and new APUs with integrated Vega.




Now, as for AMD's Execution, my current feeling is that CES has turned out to be a big middle finger to Ryzen Mobile.
Again, tens of new 2-in-1s and thin laptops with the quad-core Kaby Lake U and zero with Ryzen U.
In fact, the only new laptop I've seen announced with a Ryzen U was this piece of crap with a 2700U and an RX560.
I mean... you have a 15W APU with the best integrated GPU on the market, and you put it in a 15" behemoth together with a discrete GPU?

This is Carrizo all over again.
 
Search for any AMD GPU on Amazon and here's what they recommend:
Wow, that's sad. And scary. :p

This is Carrizo all over again.
Give it some time. AMD's name recognition will continue to improve with Ryzen 2, and then the laptop demand will come as well. Rome wasn't built in a day and all that... (Of course, we all probably wish they'd hurry up just a bit, tho... :p)
 
Also, the 2200G hits an awesome performance/price point; it makes every other budget option look like bad value. It will be interesting to see if they'll be able to meet demand for that chip.
 
First 3DMark 11 results from a $170 Ryzen 5 2400G pitted against an i5 7600 + GT1030 ($220 CPU + $75 GPU) and against an i7 7700 + GT1030 ($300 CPU + $75 GPU):

https://www.3dmark.com/compare/3dm11/12257178/3dm11/12554398/3dm11/12593862



I wonder what memory they're using in the 2400G.
Regardless, this is nothing short of game-changing. AMD is offering ISO gaming performance at practically half the price.
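For the curious, here's the quick math on those platform prices (a throwaway Python snippet, using only the prices quoted above):

```python
# Platform prices as quoted above (USD); the GPU-side performance is
# roughly equal per the linked 3DMark 11 comparison.
amd = 170                          # Ryzen 5 2400G (CPU + Vega 11 iGPU)
intel_builds = {
    "i5 7600 + GT1030": 220 + 75,
    "i7 7700 + GT1030": 300 + 75,
}
for name, total in intel_builds.items():
    print(f"2400G costs {amd / total:.0%} of the ${total} {name}")
```

That prints 58% and 45%, so "practically half" holds.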



Give it some time. AMD's name recognition will continue to improve with Ryzen 2, and then the laptop demand will come as well. Rome wasn't built in a day and all that... (Of course, we all probably wish they'd hurry up just a bit, tho... :p)


Honestly, I don't think that's it.
CES has companies showing off products that are still half a year away. Kaby Lake G was announced a couple of months ago and it has more design wins on the CES floor than AMD with Ryzen Mobile.


Then pretty much all laptop SKUs with Ryzen Mobile are stupidly flawed in one way or another:

- The ENVY x360 is 15", big and heavy, but won't get the higher-end Ryzen 7 model.

- The Acer Swift 3 is also 15", big and bulky, but won't get DDR4-2400 and maxes out at 8GB RAM.

- The Lenovo 720S finally has an interesting form factor at 13.3" and a little over 1kg, but only offers single-channel DDR4-2133, also topping out at 8GB.

- The Acer Nitro 5 is just a terrible product. The thought process behind it seems to have been "OK, AMD paid us to build something with their new APUs, so let's think of something terrible because we need to keep these guys in a bad light".

- Dell apparently has an Inspiron 5000 lined up that will get either the 2500U or the 2700U, at 15" or 17"... and then you get the option to add a spectacular Radeon 530, which is a GCN1 Oland with 6 CUs. Yes, some people will pay extra for a discrete GPU that performs significantly worse than the included iGPU.


All these flaws result in people either going with the typical Intel+Nvidia setup, or buying these products and being disappointed with their performance, further stigmatizing AMD's brand.


So what's happening so far is that AMD is simply being pushed around by laptop makers, only getting design wins in super mediocre products that few people will want.
Again.
 
First 3DMark 11 results from a $170 Ryzen 5 2400G pitted against an i5 7600 + GT1030 ($220 CPU + $75 GPU) and against an i7 7700 + GT1030 ($300 CPU + $75 GPU):

https://www.3dmark.com/compare/3dm11/12257178/3dm11/12554398/3dm11/12593862



I wonder what memory they're using in the 2400G.
Regardless, this is nothing short of game-changing. AMD is offering ISO gaming performance at practically half the price.

While that is true, I fail to see why someone would bother buying a GT1030 for gaming. Last I heard, that type of card was used more in HTPCs for media decoding. Especially if the same person had the cash to get an i7-7700... That's a hugely unbalanced PC. The same person who does that probably has no idea about hardware and just goes by brand names. The AMD brand hasn't enjoyed a good reputation for the past few years, ever since Bulldozer...
 
eSports crowd, basically. The card is great for Overwatch, Rocket League, CS:GO, DOTA and Starcraft 2.
That's what marketing says, but is that really the case? CS:GO and DOTA 2 I could imagine; Rocket League - not my league; Overwatch is much nicer at higher settings, and aiming improves considerably at 120+ fps synced.

While that is true, I fail to see why someone would bother buying a GT1030 for gaming. Last I heard, that type of card was used more in HTPCs for media decoding. Especially if the same person had the cash to get an i7-7700... That's a hugely unbalanced PC. The same person who does that probably has no idea about hardware and just goes by brand names. The AMD brand hasn't enjoyed a good reputation for the past few years, ever since Bulldozer...
Hugely unbalanced seems to be the key here. It looks like a build put together just to convey a message.
 
Overwatch is much nicer at higher settings, and aiming improves considerably at 120+ fps synced.
People in esports don't use higher settings. More bling and effects = more stuff for the brain to process = less attention to what's really needed. Just 1080p low is fine.
As for 120+ fps synced, I think you're already talking about a niche within the esports community.
 
People in esports don't use higher settings. More bling and effects = more stuff for the brain to process = less attention to what's really needed. Just 1080p low is fine.
As for 120+ fps synced, I think you're already talking about a niche within the esports community.
But in any case, the key issue is pairing a $220 CPU with a $75 GPU. We need to see a comparison with a $70 CPU + $100 GPU or similar.
 
I think the only time I found AMD appealing in notebooks was back when they launched Mobile Athlon 64. ;) That was awesome compared to Pentium 4.

Have there been any notebooks with an AMD CPU that were premium quality?
 
But in any case, the key issue is pairing a $220 CPU with a $75 GPU. We need to see a comparison with a $70 CPU + $100 GPU or similar.

$75 gives you a 2-core/2-thread Skylake Pentium at 3.5GHz.
$84 gives you a 2-core/4-thread Kaby Lake Pentium at 3.6GHz.

Let's go with the $84 Pentium. You now have $86 for a GPU. What do you get?
Hint: not a GT1050. You'd need $130 for the cheapest one.


So now we have the 2400G: a 4-core/8-thread CPU at 3.5-3.9GHz (unlocked for overclocking) with Vega 11, against a 2-core/4-thread Pentium locked at 3.6GHz (no Turbo) with a GT1030.
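To put the whole budget side by side, here's a quick tally (all prices are the ones quoted in this thread):

```python
# Build totals in USD, using the prices quoted in this thread.
builds = {
    "Ryzen 5 2400G (4C/8T + Vega 11)":    170,
    "Kaby Lake Pentium (2C/4T) + GT1030": 84 + 75,
    "Kaby Lake Pentium (2C/4T) + GT1050": 84 + 130,   # cheapest GT1050
}
for name, total in builds.items():
    print(f"{name}: ${total}")
```

The GT1050 combo lands at $214, well past the 2400G's $170, which is why the GT1030 is the fair comparison point.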
 
People in esports don't use higher settings. More bling and effects = more stuff for the brain to process = less attention to what's really needed. Just 1080p low is fine.
As for 120+ fps synced, I think you're already talking about a niche within the esports community.
If I'm talking esports, I'm not talking about Joe Garage gamer who happens to play one of these titles, but about people who play competitively. And those people of course use lowered details, as you describe, but at the same time they also enable anti-aliasing in order to - just like you said - have as few distracting artifacts as possible. Of course, they do not turn down visibility range and other settings that give them a competitive advantage.

For some competitive titles, I find it hard to get sufficient fps with a 1050 Ti, so I'm having a hard time imagining getting that with a 1030. Yes, sure, you can get to 60-ish fps, but at settings that real esports players would use? My experience differs from a plain "low".

So now we have the 2400G: a 4-core/8-thread CPU at 3.5-3.9GHz (unlocked for overclocking) with Vega 11, against a 2-core/4-thread Pentium locked at 3.6GHz (no Turbo) with a GT1030.
Changes the narrative, though.
Regardless, this is nothing short of game-changing. AMD is offering ISO gaming performance at practically half the price.
Consider that you could also get an i3-8100 and a 1030 for ~$200 if you think you're severely hampered by a Hyper-Threaded dual-core Pentium.

Plus, there's what 3DMark results conveniently shroud in veils: how well does dynamic power allocation work, and what frequencies are those 65 watts good for? In actual games, it's not as if the CPU cores take a break while the graphics are loaded, and vice versa. I'm thrilled to see actual performance numbers in games.

--
I still think the Ryzen 3 2200G and the R5 2400G will totally rock and be VERY hard to beat at their respective price points. But controlled leaks seemed to instill unrealistically high hopes for the last couple of generations where graphics are concerned, so I remain skeptical as to the actual extent of the "G" models' advantage.
 
There's near zero reason for AMD to launch new desktop discrete GPUs this year... AMD is selling all the desktop cards they're making regardless.
AMD is probably also being very cautious about ramping up GPU production because they wouldn't gain any gaming market share (the cards would all end up with the miners), and the moment there's a cryptomining crash we'll get a flood of used cards on the market, plus lots of brand-new GPUs no one wants because of all those used cards. The more cards they make right now, the bigger their sales drought will be in a year or so.

If all desktop cards are consumed by miners, how would we know if AMD had already ramped up their production? I'd rather think they decided to remain on the current process so they don't lose 2-3 months of production capacity retooling the lines for the new 12nm process. The market is still booming - you basically can't buy ANY mid-range card from either AMD or Nvidia, new or used, except in ready-made PCs. I've just been to a local pick-up point of a large online store, and there was a pile of eight 1050 Tis waiting - and that's in a remote suburban area.


As for the hypothetical cryptomining crash resulting in stockpiles of used GPUs - well, Moore's Law still holds. AMD would have 7nm Navi parts by then, which should offer 1.5-2x better performance/power efficiency, so in that case things would just get back to normal, where used mid- and top-end cards (4-8 TFLOPS) go for $100-150, new mid-range cards (6-12 TFLOPS) go for $150-250, and the high-end (12-15 TFLOPS) goes for $300-400.
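As a rough sanity check, the implied $/TFLOPS across those tiers stays fairly flat (midpoints of the ranges above, purely illustrative):

```python
# $/TFLOPS at the midpoints of the quoted ranges (illustrative only).
tiers = {
    "used mid/top-end": ((4 + 8) / 2,   (100 + 150) / 2),
    "new mid-range":    ((6 + 12) / 2,  (150 + 250) / 2),
    "new high-end":     ((12 + 15) / 2, (300 + 400) / 2),
}
for name, (tflops, usd) in tiers.items():
    print(f"{name}: ~${usd / tflops:.0f}/TFLOPS")
```

About $21-26 per TFLOPS across the board, which is what "back to normal" looks like.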

I'd be more concerned if the aforementioned crash doesn't happen and AMD is still unable to ramp up production of Navi and hold their share of the gaming market.


Maybe AMD should consider a high-end hardware/software solution to cater to the specific needs of the professional mining crowd, like increased power efficiency. For example, do they really need 8 GB of dedicated HBM memory per chip - would it be more cost-effective to offer down-clocked multi-chip solutions with a reduced amount of shared GDDR memory?
 
If all desktop cards are consumed by miners, how would we know if AMD had already ramped up their production?
We don't. I wrote that they're being cautious about ramping up, i.e. they're not frantically telling GloFo to go full throttle on Vega and Polaris just because there are no GPUs on the shelves, for the reasons I mentioned.
They may have ramped up production, just not to a Nintendo Switch "OMG-we're-selling-so-much-more-than-we-anticipated-just-double-production!" degree.

Regardless, I'd say their focus right now is on Vega M. During 2018, Kaby G alone may put far more Radeons in the hands of gamers than discrete Polaris and Vega combined.
Though I have to say, Kaby G is probably going to push some 1200 H/s on CryptoNight using less than 100W, and that should make those laptops a much more interesting purchase for gamers who don't mind doing some mining in the laptop's idle time (especially if you can easily replace the fans when they die).
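Back-of-the-envelope on that guess (the electricity rate is my own assumption, not a figure from this thread):

```python
# Rough mining efficiency for Kaby G, using the numbers guessed above.
hash_rate = 1200        # H/s on CryptoNight (guess from the post above)
power = 100             # W, the upper bound from the post above
kwh_price = 0.12        # USD/kWh -- assumed; varies a lot by region

print(f"{hash_rate / power:.0f} H/s per watt")
print(f"${power / 1000 * 24 * kwh_price:.2f} of electricity per day")
```

That's 12 H/s per watt and under $0.30/day in power, so whether it's worth it comes down entirely to the coin's price.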




I think AMD should consider making dedicated hardware/software to cater to the specific needs of the mining crowd, like increased power efficiency.
Making dedicated ASICs is too costly and risky, and it would take too much time. Even if they had started developing dedicated "blockchain processing units" (BPUs) back in mid-2017, a real product wouldn't be out until late 2019, and who knows whether there would still be demand for it by then.

As for SKUs with fewer video outputs and lower clocks, OEMs already have those, and I think they weren't terribly successful: the voltages weren't properly optimized, and the cards weren't made significantly cheaper, meaning the hit to resale value might be larger than the initial price delta.
 
One of the shortcomings of Bitcoin is that it is much more effectively mined with special-purpose ASICs than with general-purpose CPUs or GPUs. The barrier to entry for a miner who wants to mine Bitcoin economically is thus high. That's a problem, because the validity of the blockchain is decided by majority, and when you have few, big miners, your currency is vulnerable to those miners pooling their resources to gain a majority (the so-called 51% attack).

The newer currencies try to mitigate this by shaping the workload so that it fits the capabilities of hardware already in widespread use. E.g. Ethereum has a big (and growing) memory footprint, which renders ASICs worthless (because the cost is primarily in the memory subsystem). Mining is piggybacked onto existing hardware.
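For anyone who wants the intuition in code, here's a toy sketch of a memory-hard proof-of-work (my own illustration, nothing like real Ethash internals): every hash forces unpredictable reads across a large dataset, so throughput is bound by memory bandwidth, which an ASIC can't optimize away the way it can raw hashing logic.

```python
import hashlib
import os

# Toy memory-hard proof-of-work. DATASET_ITEMS is tiny here so the demo
# runs quickly; the real Ethash DAG is multiple GB and keeps growing.
DATASET_ITEMS = 1 << 18   # ~8 MB of 32-byte digests
dataset = [hashlib.sha256(i.to_bytes(8, "little")).digest()
           for i in range(DATASET_ITEMS)]

def memory_hard_hash(header: bytes, nonce: int, rounds: int = 64) -> bytes:
    """A hash that needs `rounds` unpredictable reads from the big dataset."""
    mix = hashlib.sha256(header + nonce.to_bytes(8, "little")).digest()
    for _ in range(rounds):
        # The next read address depends on the running mix, so the memory
        # accesses cannot be precomputed or served from a small cache.
        idx = int.from_bytes(mix[:4], "little") % DATASET_ITEMS
        mix = hashlib.sha256(mix + dataset[idx]).digest()
    return mix

def mine(header: bytes, zero_bytes: int = 1) -> int:
    """Find a nonce whose hash starts with `zero_bytes` zero bytes."""
    nonce = 0
    while not memory_hard_hash(header, nonce).startswith(b"\x00" * zero_bytes):
        nonce += 1
    return nonce

print("winning nonce:", mine(os.urandom(32)))
```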

As long as cryptocurrencies are an inflating market, we'll see big demand for hardware that mines well.

We'll just have to wait for the bubble to burst :)

Cheers
 
Most people holding bitcoin are IMO just playing a game of speculation for now, and I'd say a majority of them have no idea how dangerous Bitcoin can be, with those few big miners potentially coming together and suddenly owning the whole thing.
Though I guess that's exactly what Ripple is (owned by banks, no less!), and even I know people who are happy purchasers of it.

I think CryptoNight coins are great, though. Even tablets can be profitable mining them. My Surface Pro 4 m3 has mined all day at a steady 70 H/s lol.
The fact that almost anything that runs an OS can be used to mine CryptoNight coins makes them much more decentralized than anything else.
 