AMD Vega 10, Vega 11, Vega 12 and Vega 20 Rumors and Discussion

They did. In the ProRender/Blender demo, didn't they?
As lanek pointed out, OpenCL and graphics would be a bit different. ProRender/Blender, with all the work AMD has been doing there, would be designed to scale. As they already had a platform with 64 PCIe lanes and 4 GPUs (enough for x16 to each card), why not reuse that rather than a dual-CrossFire configuration? As no performance figures were given, there was nothing to give away. There aren't a lot of people with a Threadripper and 4 GPUs to present a relevant comparison either.

And the cherry on top is that they didn't show performance numbers, but forgot to turn vsync on.
I get that you don't want vsync on if you want to show raw FPS output, but if you're not showing FPS numbers, then showing tearing in a demo made for tens of thousands of potential customers is simply stupid or very ignorant.
There are two sides to that though, as AMD was demonstrating both a CPU and a GPU. At the moment the CPU appears to be their focus, so vsync off makes more sense.
 
IIRC, they did announce back in December that Vega would come in H1 2017, and they added something along the lines of "that doesn't mean they'll launch just at the very end, in June".
 
In addition, I don't think they really want to give straight numbers when the launch is set for late July (nearly two months out)... especially if they are still tuning drivers and clock speeds.

Anyway, June, July and August will be loaded for the AMD marketing team... Threadripper, the Vega FE, RX Vega, additional Ryzen products (mobile), etc. We can imagine some choices had to be made.

The good news is that this puts a lot of products and hardware launching just before the end of the holidays and the back-to-school period.
 
If we get watercooled reference cards with a reasonably quiet pump, maybe fewer of us will feel the need to wait for custom boards?
 
Should have shown off 4 cards in CrossFire if they're going to do that.

Something to speculate on:

[image: lFKALdd_d.jpg]
 
The real question is what are they concealing with Vega at this point?
My guess - waiting for the promised (2016?) 2 Gbps HBM2. If they launched with only 409 GB/s of bandwidth, just think of all those 5th-gen GCN cores, twiddling their lil' silicon thumbs, waiting for data...
Good idea to wait.
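For anyone checking the arithmetic, here's a minimal sketch of where a figure like 409 GB/s comes from, assuming two HBM2 stacks for a 2048-bit interface (that configuration is my assumption, not something stated above):

```python
# HBM2 bandwidth back-of-the-envelope: per-pin speed (Gbps) times bus
# width (bits) divided by 8 gives GB/s. The 2048-bit two-stack bus is
# an assumption for illustration.
def hbm2_bandwidth_gbs(pin_speed_gbps, bus_width_bits=2048):
    return pin_speed_gbps * bus_width_bits / 8

print(hbm2_bandwidth_gbs(1.6))  # 409.6 -- the ~409 GB/s figure quoted above
print(hbm2_bandwidth_gbs(2.0))  # 512.0 -- with the promised 2 Gbps parts
```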
 
There's no reason to believe 2 Gbps HBM2 isn't available: it appeared in a catalog nearly a year ago, listed to become available within 2 months of the catalog update, before it "mysteriously disappeared". We also know AMD has ~1.88 Gbps (probably 2 Gbps) 8-Hi stacks, even though no manufacturer has had 8-Hi stacks in their catalog at any speed.
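The same arithmetic run backwards shows how a per-pin figure like ~1.88 Gbps can be recovered from a quoted aggregate bandwidth; the 480 GB/s input below is a hypothetical example, not a number from the post:

```python
# Invert the HBM2 bandwidth formula: recover per-pin speed (Gbps) from
# an aggregate bandwidth (GB/s). The 2048-bit bus width is again an
# assumption for illustration.
def hbm2_pin_speed_gbps(bandwidth_gbs, bus_width_bits=2048):
    return bandwidth_gbs * 8 / bus_width_bits

print(hbm2_pin_speed_gbps(480))  # 1.875 -- lands in the "~1.88 Gbps" class
```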
 
Sampling four frames in a single refresh? That screenshot is probably a 30 Hz stream capturing a 120 fps render without vsync. If it was a 60 Hz stream, then it's impressive.
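As a rough model of why a low-rate stream exaggerates tearing: with vsync off, one captured refresh scans across roughly render_fps / capture_hz rendered frames. A sketch with illustrative numbers (the 120 fps figure is a guess, as above):

```python
import math

# Approximate count of distinct rendered frames visible in one captured
# refresh with vsync off; a rough model, not an exact tear count.
def frames_per_refresh(render_fps, capture_hz):
    return math.ceil(render_fps / capture_hz)

print(frames_per_refresh(120, 30))  # 4 -- consistent with four slices in the shot
print(frames_per_refresh(120, 60))  # 2 -- a 60 Hz stream would show far fewer
```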

Waiting on faster HBM2 wouldn't affect engineering samples, as they could source faster memory for those. Even then, the bandwidth shouldn't be that much of a concern: with presumably larger caches, tiling, and the ROPs on L2, the bandwidth requirements should be significantly lower. It's more likely they're concealing performance relative to Volta while finishing off the drivers. AMD still hasn't said much about the NCUs and caching beyond FP16 rates, and there are only so many reasons to keep that secret.
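To put a toy number on the caching point (all figures below are invented for illustration): traffic that hits in a larger cache never reaches the HBM2 bus, so the DRAM bandwidth actually required scales with the miss rate:

```python
# Toy model: DRAM bandwidth needed after caches/tiling absorb part of
# the raw traffic. The traffic and hit-rate figures are made up.
def required_dram_bandwidth_gbs(raw_traffic_gbs, cache_hit_rate):
    return raw_traffic_gbs * (1.0 - cache_hit_rate)

print(required_dram_bandwidth_gbs(600, 0.0))   # 600.0 -- no caching benefit
print(required_dram_bandwidth_gbs(600, 0.35))  # 390.0 -- fits under ~409 GB/s
```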
 
It's a 30 Hz stream. It's a stupid little thing, but NV streamed at 60 Hz, and I think that's what everyone should be doing for anything involving GPU demos, precisely for this reason.
For a live stream, 30 Hz isn't unreasonable. For YouTube videos of demos, along with a higher bitrate, I'd agree with that sentiment. IMHO it was more of a CPU demo with a GPU tacked on anyway. Now they have their own high-end GPUs, as opposed to the Nvidia cards used when Ryzen was unveiled.

 
I thought the demo was more about Threadripper than Vega, but that's just me. The only takeaway regarding Vega is that they are using Vega instead of a Titan Xp; when they first did demos of Ryzen, they used Nvidia.

The reason they used 2 cards probably comes down to the line where they said something like "this will be the ultra-enthusiast system to game on this summer." Two cards make sense for that, especially when they are demoing a CPU that is likely $1000 if not higher. If RX Vega is $500-$700, it would make a lot of sense for these systems to have more than one.
 
 
The choice of game was pretty horrid nonetheless. It's a game using the old DX11 version of CryEngine, which practically doesn't scale beyond 4 cores and runs at 4K effortlessly on single cards from the competition.
I see Raja is saying they were just showing Threadripper's I/O capabilities, but isn't that pointless when they had already shown 4 cards running Blender?
And if they're omitting the framerate, why not run the game at a ridiculously high resolution like 8K or 3x4K monitors?
 
Because ProRender is AMD's raytracing engine, and they were proud to show it; they also participate heavily in the Blender Foundation and in Blender development (OpenGL viewport, Cycles development).
Honestly, for raytracing, 16 cores plus 4 GPUs is a dream home system (I only have to look at the comments on various CG forums).

As for Prey, it is one of the most popular games released since the start of 2017, and they work with Bethesda on it.

I'm not saying the demo wasn't useless, just that I'm not sure their point with it, two months before launch, was to show performance.
 
What would have been at least somewhat more impressive is if they'd shown Prey using a Vulkan rendering path to highlight their partnership with Bethesda (a commitment to Vulkan for games). Granted, Prey started development way before the partnership, and likely before id committed to Vulkan for Doom, so Prey was highly unlikely to ever get a Vulkan rendering path anyway, but it would have been impressive had it happened.

But considering their partnership with Bethesda, this was probably the best they could do. They could have done Doom, but Doom is no longer in the spotlight. Fallout 4 would have been an even worse choice. Dishonored 2? Meh.

Regards,
SB
 
I saw this new article on Vega 11, but I don't think there's any actual content (I shouldn't be surprised...) other than speculation that nearly any frequent viewer of this thread could easily make.

http://wccftech.com/amd-radeon-rx-vega-11-mainstream-gpus-late-2017-launch/

I'm not normally one of those "don't give them clicks, whine-whine-whine" guys, but in this case, I think it might be justified, hence:

AMD might be releasing the Radeon RX Vega graphics cards for enthusiast gamers in Q3 2017, but there's no word on the mainstream cards. A source over at PCGamesHardware has revealed that Radeon RX Vega cards based on mainstream GPUs won't see the light of day until late 2017 or early 2018.

It looks like AMD has no plans to replace their mainstream Radeon RX 500 series cards with Vega based offerings anytime soon. According to reports, while the Radeon RX Vega graphics cards for enthusiast gamers will be launching in Q3 of this year, the mainstream lineup is planned to ship in Q4 2017, with a possible launch time frame of Q1 2018.

Yesterday, AMD finally gave us a release date for the Radeon RX Vega enthusiast graphics chips featuring Vega 10 GPUs. These graphics cards will feature up to 4096 stream processors, HBM2 VRAM and several new technologies such as HBCC (High Bandwidth Cache Controller). Details on these technologies are available here.

These graphics cards will be aimed at the enthusiast market, with prices topping the $500 US range. But AMD is known to have two Vega chips: Vega 10 is the bigger and Vega 11 the smaller of the two. The Vega 11 GPU will fill in the mainstream section of AMD's Radeon RX Vega lineup, but it will not launch until later this year and may even slip to CES 2018. The reason is that AMD will be entirely focused on getting their high-end parts out as soon as possible. The Radeon RX Vega will be AMD's first high-end graphics card offering in more than two years, after the Fiji-based Radeon R9 Fury X.

Since then, AMD has offered no graphics cards in the enthusiast market, and NVIDIA has gained a lot of market share in this segment with multiple enthusiast cards above the $299 US price range. Cards such as the GTX 1070, GTX 1080, GTX 1080 Ti, GTX Titan X (P) and GTX Titan Xp are readily available for enthusiasts and high-end gamers to purchase. AMD has basically given NVIDIA full control of the enthusiast market by not launching their own high-end solution in over two years.

So while enthusiasts would be pleased to learn that AMD is launching Radeon RX Vega at SIGGRAPH, starting the 30th of July, the other story is when consumers can buy a more affordable Radeon RX Vega graphics card. The current Radeon RX 500 family covers the mainstream market with the RX 550, RX 560, RX 570 and RX 580. It's a full-fledged lineup under $300 US.

AMD doesn’t want to replace it so soon as it was just launched a few months back. This is why we have to wait till later this year to get news on the Vega 11 based models. Vega is indeed coming to gaming PCs but only in the high-end market. Meanwhile, NVIDIA’s Pascal GPU lineup covers the entire graphics market with entry level, mainstream, high-end and enthusiast aimed products.

There's also some discussion about whether the upcoming Radeon RX Vega launch at SIGGRAPH 2017 will be a hard launch or a paper launch. If it turns out to be a paper launch, users will be waiting a few more weeks to get cards in their hands.

I feel like I'm missing the "exclusive" part of this article.

I'd bet my left testicle that AMD will release a water-cooled halo SKU at at least the price of Nvidia's top card (e.g. $700 for the 1080 Ti, ??? for GV104, etc.). They have too much expertise in, and too many supplier relationships around, closed-loop coolers not to utilize them.

Now that doesn't mean AMD won't go for an air-cooled variant (with lower clocks) at a lower (-$100?) price to win the price/perf race. God knows they need to own the price/perf crown after abandoning the high end for two entire years.

I haven't seen anyone mention that the brief clip shown in Prey is probably the single most visually intense scene in the entire game (i.e. most enemies/NPCs on screen at once).
  • They probably picked it because they wanted something remotely stressful for their flashy 16-core, dual-Vega 10 system that still used the newest Bethesda title.
  • The player tried to extend the scene as long as possible by using the non-lethal GLOO gun and letting the AI finish it.
My guess is that AMD was making the best of an awkward situation caused by a presumed agreement with Bethesda and the choice to use a really beefy system.
 
I don't know whether it's the "single most visually intense scene in the entire game", but Prey can be demanding: ComputerBase got 64 fps on a 1080 Ti in a demanding scene, and that was before the 25-30% performance hit from the 1.2 patch (which would put the same scene at roughly 45-48 fps).

 