AMD: Pirate Islands (R* 3** series) Speculation/Rumor Thread

So what's the point of feature levels and tiers? Just marketing fodder?
To make life easier for developers by grouping together common features. If you stick to feature levels, you don't have to plan for a scenario where you have ROVs but no conservative rasterizer, or bindless but no tiled resources, for example.

Developers that want to go off the beaten path can certainly do so, but you can stick to feature levels if that will make your life more sane.

Edit: And I should probably note that there are a number of gotchas. In FL11_1 you can't access certain features unless you support other features, etc., which is why NV supports most FL11_1 features (and they can be used) without supporting all of it.
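For the "off the beaten path" case, a minimal sketch (not from the post above; the helper name is made up and it assumes you already have a valid ID3D12Device* called device) of checking those individual caps directly instead of relying on a feature level:

Code:
#include <d3d12.h>

// Query the optional-feature caps one by one; each can come back true or
// false independently of the others, which is exactly the mix-and-match
// scenario that feature levels spare you from having to plan for.
bool HasFancyCaps(ID3D12Device* device)
{
    D3D12_FEATURE_DATA_D3D12_OPTIONS options = {};
    if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS,
                                           &options, sizeof(options))))
        return false;

    const bool rovs         = options.ROVsSupported != FALSE;
    const bool conservative = options.ConservativeRasterizationTier
                              != D3D12_CONSERVATIVE_RASTERIZATION_TIER_NOT_SUPPORTED;
    const bool tiledTier2   = options.TiledResourcesTier >= D3D12_TILED_RESOURCES_TIER_2;
    const bool bindingTier3 = options.ResourceBindingTier >= D3D12_RESOURCE_BINDING_TIER_3; // "bindless-ish"

    return rovs && conservative && tiledTier2 && bindingTier3;
}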
 
Honestly I don't see why almost anyone should care about what level of DX/shader model the new cards support. There really aren't a whole lot of useful features anymore, at least compared to what there used to be with the big updates. "DX12" is more interesting because of explicit control, which is backwards compatible, than any particular hardware feature or even the whole set of hardware features.

Direct3D feature levels discussion
 
AMD Confirms GCN Cards Don’t Feature Full DirectX 12 Support – Feature Level 11_1 on GCN 1.0, Feature Level 12_0 on GCN 1.1/1.2

Read more: http://wccftech.com/amd-confirms-gc...-10-feature-level-120-gcn-1112/#ixzz3cDXM9ujS

Nonsense to which the response is this:

Roy@AMD @amd_roy
@Ramzinho @Thracks zero, absolutely zero. AMD supports DX12. Period.

How is it nonsense? It's a straightforward fact: some GCN GPUs don't have FL12_0 support and none currently on the market have FL12_1 support. Which part of that is nonsense?

Also it's probably better if you link the specific tweet you're referencing.

Anyway, how many times in the past did we see Nvidia not supporting some version of DX (DX 10.1, DX 11.2, DX 11.3, etc.)?

Lots, but unless you have a time machine and intend to use it to go back to the past and live there, what does it matter now?

In this case the feature level doesn't even mean that those cards support some "DX12.1", just DX12 feature level 12_1...

AMD DirectX 12 GCN Support:

Model | Graphics Core Next Architecture | DirectX
Radeon HD 7000 series | GCN 1.0 | DX12, feature level 11_1
Radeon HD 7790 | GCN 1.1 | DX12, feature level 12_0
Radeon R7 260 (X) | GCN 1.1 | DX12, feature level 12_0
Radeon R9 270 (X) | GCN 1.0 | DX12, feature level 11_1
Radeon R9 280 (X) | GCN 1.0 | DX12, feature level 11_1
Radeon R9 285 | GCN 1.2 | DX12, feature level 12_0
Radeon R9 290 (X) | GCN 1.1 | DX12, feature level 12_0

This confirms what you just described as nonsense above.

NVIDIA Direct X 12 GPU Support:

Model | DirectX
GeForce 900 Series (Maxwell 2.0) | DX12, feature level 12_1
GeForce 700 Series (Maxwell 1.0) | DX12, feature level 11_0 (partial feature level 11_1 support)
GeForce 700 Series (Kepler) | DX12, feature level 11_0 (partial feature level 11_1 support)
GeForce 600 Series (Kepler) | DX12, feature level 11_0 (partial feature level 11_1 support)
GeForce 500 Series (Fermi) | DX12, feature level 11_0 (partial feature level 11_1 support)
GeForce 400 Series (Fermi) | DX12, feature level 11_0 (partial feature level 11_1 support)

Yes but the big difference is that right now, all new GPU releases from Nvidia are Maxwell 2 and thus have full FL12_1 support. Meanwhile AMD are still releasing new products based on Pitcairn.

I have no ill will towards AMD at all, but I do think they're harming both themselves and PC gaming as a whole by re-releasing years-old tech rebadged as something new, some of which doesn't even have FL12_0 support. If that's actually what they're doing with the 3xx series, of course. If it turns out the whole series is GCN 1.2 or above, I'll stand corrected.
 
To make life easier for developers by grouping together common features. If you stick to feature levels, you don't have to plan for a scenario where you have ROVs but no conservative rasterizer, or bindless but no tiled resources, for example.

Developers that want to go off the beaten path can certainly do so, but you can stick to feature levels if that will make your life more sane.

Edit: And I should probably note that there are a number of gotchas. In FL11_1 you can't access certain features unless you support other features, etc., which is why NV supports most FL11_1 features (and they can be used) without supporting all of it.

That makes sense but unless the grouping of features serves some specific purpose when used together it seems pretty arbitrary. It would be really helpful to understand what new techniques are made possible when certain features are used together.
 
How is it nonsense? It's a straightforward fact: some GCN GPUs don't have FL12_0 support and none currently on the market have FL12_1 support. Which part of that is nonsense?
Where has this actually been confirmed? Which FL12_0 features do GCN 1.0 miss exactly?
 
How is it nonsense? It's a straightforward fact: some GCN GPUs don't have FL12_0 support and none currently on the market have FL12_1 support. Which part of that is nonsense?

Also it's probably better if you link the specific tweet you're referencing.



Lots, but unless you have a time machine and intend to use it to go back to the past and live there, what does it matter now?



This confirms what you just described as nonsense above.



Yes but the big difference is that right now, all new GPU releases from Nvidia are Maxwell 2 and thus have full FL12_1 support. Meanwhile AMD are still releasing new products based on Pitcairn.

I have no ill will towards AMD at all, but I do think they're harming both themselves and PC gaming as a whole by re-releasing years-old tech rebadged as something new, some of which doesn't even have FL12_0 support. If that's actually what they're doing with the 3xx series, of course. If it turns out the whole series is GCN 1.2 or above, I'll stand corrected.

Yes, but Maxwell 2 only goes down to the $350 level. There are still all the GPUs from $350 on down from Nvidia that aren't DX FL12_1.

We will have to see what parts don't support 12_1 in AMD's new lineup. It might be similar or better.
 
That makes sense but unless the grouping of features serves some specific purpose when used together it seems pretty arbitrary. It would be really helpful to understand what new techniques are made possible when certain features are used together.

Really this belongs in the feature level thread, but it's actually not arbitrary at all. It's based around what shared features each IHV can support (or will support in the near future). So for instance, 12_0 is clearly meant to equate to "console level". There's going to be a lot of interest among developers to target that grouping of features. It's nice to have a simple check "can this do basically everything the x1/ps4 can do?". 12_0 becomes the PC rendering path "equivalent" to consoles.

For 12_1 Microsoft got Intel, Nvidia, and AMD (and maybe Qualcomm :p) in a room and asked "can your next-generation GPUs support x, y, z features?". If they all say yes then it makes sense to group those features together. For a developer it becomes hard to support various optional features (it creates many different rendering paths). If we can group several features together that we know most (new) GPUs in 2016 will support, it allows us to cut down on the number of rendering paths that we have to use (which saves development time, QA time, etc.). Feature levels are about grouping GPUs across various IHVs that have similar capabilities. Don't worry, Microsoft doesn't determine feature levels by throwing darts at a wall. :D Essentially the market dictates feature levels.
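As a rough illustration of that single check (a hypothetical helper, assuming a valid ID3D12Device*), a renderer can ask the driver once for the highest supported feature level and branch on it, instead of maintaining a separate path per optional cap:

Code:
#include <d3d12.h>

// Ask the driver which of the requested feature levels the adapter
// supports, then branch render paths on the answer.
D3D_FEATURE_LEVEL QueryMaxFeatureLevel(ID3D12Device* device)
{
    const D3D_FEATURE_LEVEL requested[] = {
        D3D_FEATURE_LEVEL_11_0, D3D_FEATURE_LEVEL_11_1,
        D3D_FEATURE_LEVEL_12_0, D3D_FEATURE_LEVEL_12_1
    };

    D3D12_FEATURE_DATA_FEATURE_LEVELS levels = {};
    levels.NumFeatureLevels        = sizeof(requested) / sizeof(requested[0]);
    levels.pFeatureLevelsRequested = requested;

    if (SUCCEEDED(device->CheckFeatureSupport(D3D12_FEATURE_FEATURE_LEVELS,
                                              &levels, sizeof(levels))))
        return levels.MaxSupportedFeatureLevel;

    return D3D_FEATURE_LEVEL_11_0; // conservative fallback
}

// Usage: if (QueryMaxFeatureLevel(device) >= D3D_FEATURE_LEVEL_12_0)
//            take the "console-equivalent" rendering path.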
 
Where has this actually been confirmed? Which FL12_0 features do GCN 1.0 miss exactly?

It was in the link that I was responding to:

http://wccftech.com/amd-confirms-gc...-10-feature-level-120-gcn-1112/#ixzz3cDXM9ujS

DmitryKo has explained it in more detail on Wikipedia:

https://en.wikipedia.org/wiki/Direct3D#Direct3D_12_levels

It's lacking Tier 2 tiled resources. Probably not a big deal, but IMO unacceptable for AMD to be releasing a brand new GPU lineup at this stage where the mid-range GPU is the same mid-range GPU they launched over 3 years ago with a different name. Assuming, of course, that Pitcairn is really going to form part of the 3xx range.
 
I wouldn't call a random article proof without actual data behind it (like whatever software it was that shows the D3D features supported by a card, with DX12 drivers on Win10).
It's lacking Tier 2 tiled resources. Probably not a big deal, but IMO unacceptable for AMD to be releasing a brand new GPU lineup at this stage where the mid-range GPU is the same mid-range GPU they launched over 3 years ago with a different name. Assuming, of course, that Pitcairn is really going to form part of the 3xx range.
If it indeed misses Tier 2 then it's obviously clear, but has this already been tested on DX12 drivers with whatever that software was that shows the supported features?

As for Pitcairn, before calling it a chip from 3 years ago, we should somehow get clear on whether it was upgraded right after the HD 7 series, as suggested by its apparent OpenCL 2.0 support in the newer series.
 
I wouldn't call a random article proof without actual data behind it (like whatever software it was that shows the D3D features supported by a card, with DX12 drivers on Win10).

If it indeed misses Tier 2 then it's obviously clear, but has this already been tested on DX12 drivers with whatever that software was that shows the supported features?

Probably best to direct that question to Ryan or DmitryKo in the Direct3D feature levels thread:

https://forum.beyond3d.com/threads/direct3d-feature-levels-discussion.56575/page-12#post-1848939

As for Pitcairn, before calling it a chip from 3 years ago, we should somehow get clear on whether it was upgraded right after the HD 7 series, as suggested by its apparent OpenCL 2.0 support in the newer series.

Unless it supports Tier 2 tiled resources, TrueAudio and FreeSync, then it's not enough IMO. Really the entire range should be at least GCN 1.2, but we know that's not going to happen.
 
How is it nonsense? It's a straightforward fact: some GCN GPUs don't have FL12_0 support and none currently on the market have FL12_1 support. Which part of that is nonsense?

Full DX12: there is no such thing as full or partial DX12 support.

Here is your link: https://twitter.com/amd_roy
5:11 AM - 6 Jun 2015
Roy@AMD retweeted
"Robert Hallock ‏@Thracks 6h6 hours ago
@Luraziel @rluik @amd_roy It's simple: feature levels don't matter. Games don't require them. You either support DX12, or you don't."

Oh, and you never know about a DX12 feature level 12_2, for example;
how do you know that DX12 feature level 12_1 is the final one?
 
Full DX12: there is no such thing as full or partial DX12 support.

Here is your link: https://twitter.com/amd_roy

He's (probably deliberately) confusing DX12 (the API) support and feature level support.

All GCN GPUs support the DX12 API and so will benefit from the reduced overhead. GCN 1.0 GPUs don't support FL12_0 so will not be able to take advantage if games utilise the features it lacks (just Tier 2 tiled resources, so unlikely to be of any consequence). All GCN GPUs currently on the market lack support for the FL12_1 features and so will be unable to take advantage of any games that use them. Since all NV's new GPUs for the past 9 months and all Intel's GPUs since Haswell support at least ROVs, I'd say there's a reasonable chance we could see games that utilise that feature in the future, which would be a genuine disadvantage for AMD.
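To make the API-vs-feature-level distinction concrete, here's a hedged sketch (hypothetical setup code, not anyone's actual implementation; it assumes a DXGI adapter has already been enumerated): D3D12CreateDevice only asks for a minimum feature level of 11_0, so any GCN part can create a DX12 device and get the low-overhead API even if it tops out at FL11_1.

Code:
#include <d3d12.h>
#include <dxgi.h>
#include <wrl/client.h>
using Microsoft::WRL::ComPtr;

// Any FL11_0-capable GPU can run the D3D12 API; the minimum feature
// level passed here says nothing about FL12_0/12_1 support.
bool CreateDx12Device(IDXGIAdapter* adapter, ComPtr<ID3D12Device>& outDevice)
{
    return SUCCEEDED(D3D12CreateDevice(adapter,
                                       D3D_FEATURE_LEVEL_11_0,
                                       IID_PPV_ARGS(&outDevice)));
}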

Roy knows full well that when the news sites are reporting that AMD doesn't support DX12 they are talking about feature level 12_1, but he responds to the question with an answer relating to the API support rather than the feature levels.

As for that statement that "feature levels don't matter", it's frankly laughable. AMD have been saying the exact opposite for years while they've been ahead on the feature level front, but now that they're behind it doesn't matter anymore? And how could he possibly know whether future games will make use of FL12_1 features or not?

Oh, and you never know about a DX12 feature level 12_2, for example;
how do you know that DX12 feature level 12_1 is the final one?

FL12_1 is as high as it goes for now; obviously there will be something beyond that in the future, but right now that's it, and Nvidia supports it while AMD doesn't. I'm not sure what other way there is to interpret the situation.
 
Since all NV's new GPUs for the past 9 months and all Intel's GPUs since Haswell support at least ROVs, I'd say there's a reasonable chance we could see games that utilise that feature in the future, which would be a genuine disadvantage for AMD.
Nope, since you either support the feature level or you don't - supporting individual features of said feature level is irrelevant if you don't support all of them
 
Nope, since you either support the feature level or you don't - supporting individual features of said feature level is irrelevant if you don't support all of them
Even the feature level could be irrelevant for developers. Feature levels are just a convenient way to check some well-defined sets of features.
If I know I am going to write a piece of code that takes advantage of ROVs to do something like an OIT algorithm, I will check for ROV support in my rendering path and not for FL 12_1, unless I know that I need every FL 12_1 capability in my OIT algorithm; the opposite would not make sense.
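A minimal sketch of that check (hypothetical helper name, assuming a valid ID3D12Device*): gate the ROV-based OIT path on the specific cap rather than on FL 12_1 as a whole.

Code:
#include <d3d12.h>

// A GPU can expose ROVs without meeting every other FL12_1 requirement
// (e.g. conservative rasterization), so test only the cap the OIT
// algorithm actually needs.
bool UseROVBasedOIT(ID3D12Device* device)
{
    D3D12_FEATURE_DATA_D3D12_OPTIONS options = {};
    if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS,
                                           &options, sizeof(options))))
        return false;
    return options.ROVsSupported != FALSE;
}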
 