AMD Mantle API [updating]

Specialized vertex and constant buffers
In the Mantle API, specialized vertex and constant buffers are removed in favor of more general buffer support. An application can use regular buffers to store vertex data, shader constants, or any other data.
So Mantle indeed seems to map quite well to our engine. We don't support "legacy" buffers either :D
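For illustration, Vulkan later inherited this generic-buffer model from Mantle, so a rough Vulkan sketch shows the same idea: a single buffer object created for vertex data and shader constants alike. The helper name, size, and usage flags here are just assumptions for the example; error handling is omitted.

Code:
#include <vulkan/vulkan.h>

// Hypothetical helper: one generic buffer serving as vertex data,
// shader constants, or a copy destination -- there is no dedicated
// "vertex buffer" or "constant buffer" object type.
VkBuffer createGenericBuffer(VkDevice device, VkDeviceSize size)
{
    VkBufferCreateInfo info{};
    info.sType = VK_STRUCTURE_TYPE_BUFFER_CREATE_INFO;
    info.size  = size;
    info.usage = VK_BUFFER_USAGE_VERTEX_BUFFER_BIT |
                 VK_BUFFER_USAGE_UNIFORM_BUFFER_BIT |
                 VK_BUFFER_USAGE_TRANSFER_DST_BIT;
    info.sharingMode = VK_SHARING_MODE_EXCLUSIVE;

    VkBuffer buffer = VK_NULL_HANDLE;
    vkCreateBuffer(device, &info, nullptr, &buffer);  // check the VkResult in real code
    return buffer;
}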
 


Delete if old. :)
 
I have to ask myself whether it isn't the same guys, the same team, who wrote the documentation for both, lol. (I'm joking, but maybe I'm not far off.)
 
For those who have read through the API guide that Jawed posted a link to above, have you come across a section regarding access to the HTILE buffer? I skimmed the whole document and then did a few keyword searches, to no avail. I was under the impression that access to the HTILE buffer would be available in Mantle... is that still happening?
 
Interesting read ...
And what of these claims that Microsoft would have copied Mantle? There was even some claim that the documentation was mostly the same, with some screenshots of alleged documentation of both, and alleged similarities. Now that the final documentation is out, it is clear that the two are not all that similar at the API level. DX12 is still using a lightweight COM approach, where Mantle is a flat procedural approach. But most importantly, DX12 has some fundamental differences with Mantle. Such as the distinction between bundles and command lists. Mantle only has ‘command buffers’. Again, it looks like Mantle is just a simplified version, rather than Microsoft cloning Mantle.
https://scalibq.wordpress.com/2015/09/02/directx-12-is-out-lets-review/
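To make the bundle/command-list distinction concrete, here's a rough D3D12 sketch; the variable names and setup are assumptions, and device/allocator creation plus error handling are omitted. In D3D12 the type passed at creation separates a reusable bundle from a direct command list, whereas the Mantle documentation only describes plain command buffers.

Code:
#include <d3d12.h>
#include <wrl/client.h>
using Microsoft::WRL::ComPtr;

void CreateLists(ID3D12Device* device,
                 ID3D12CommandAllocator* directAlloc,  // allocator created with TYPE_DIRECT
                 ID3D12CommandAllocator* bundleAlloc)  // allocator created with TYPE_BUNDLE
{
    ComPtr<ID3D12GraphicsCommandList> directList;
    ComPtr<ID3D12GraphicsCommandList> bundle;

    // A direct command list: recorded each frame and submitted to a queue.
    device->CreateCommandList(0, D3D12_COMMAND_LIST_TYPE_DIRECT,
                              directAlloc, nullptr, IID_PPV_ARGS(&directList));

    // A bundle: a small group of commands recorded once and replayed
    // from inside a direct command list.
    device->CreateCommandList(0, D3D12_COMMAND_LIST_TYPE_BUNDLE,
                              bundleAlloc, nullptr, IID_PPV_ARGS(&bundle));

    // Later, while recording directList:
    // directList->ExecuteBundle(bundle.Get());
}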
 
Haven't read it through yet, but does he have an axe to grind with AMD?
Now, if we move from the software to the hardware, there are some other interesting peculiarities. As I already mentioned earlier, AMD does not have support for feature level 12_1 in their latest GPUs, which were launched as their official DirectX 12 line. I think even more telling is the fact that they do not support HDMI 2.0 either, and all their ‘new’ GPUs are effectively rehashes of the GCN1.1 or GCN1.2 architecture. The only new feature is HBM support, but nothing in the GPU-architecture itself.

I get this nasty feeling that after 5 years of AMD downplaying tessellation in DX11, we are now in for a few years of AMD downplaying conservative raster and other 12_1 features.
He handily disregards how old AMD's DX12/FL12_0-supporting chips are (even GCN 1.0 is just one feature shy of D3D12 FL12_0), and that Microsoft's own Xbox One is GCN 1.1 too, and, as he says himself, MS knew what DX12 was going to be when they designed the Xbox One.
FL12_1, if anything, seems to be an afterthought added at NVIDIA's request.
Oh, and he mentions that Intel supports higher tiers than NVIDIA, but disregards the fact that AMD does the same on several features, too.

Reading a bit more, the async compute part also gives the feeling it's just a rant against AMD and against devs supposedly "supporting AMD".
 
It appears that guy has had a long history of being fairly hostile to AMD or anyone that says anything good about AMD.

He had a blog post around the time DX12 was announced by Microsoft, and the gist of it was: Nvidia is fantastic for DX12, AMD sucks.

While he may have some relevant and insightful things to say, it sometimes gets drowned out by his hatred for anything AMD.

Regards,
SB
 
Scali had an account here, and he indeed had a hatred for anything ATI/AMD. He hasn't posted here since the Fermi days; you can infer why.

It's also not surprising the kind of people who would take the time to find him and quote him.

This seems apropos

 
I never turn a "blind eye" to opposing points of view, because somewhere in the middle is the truth. Can you imagine taking anything from AMD/Nvidia at face value? Not really surprised to see extremely low # of retweets or likes on that tweet.
 
Yes, he was removed from this site for a reason. Personally I'd rank him as more vile than the console fanboys who keep insisting on secret sauce being buried in their console of choice. Likewise, I view any of his blogs as only meaningful if they are pointers to information originating elsewhere.
 
Does it really matter where Vulkan and D3D12 come from? Kudos to AMD for it, but in the end these APIs are independent of Mantle, even if there is a lot of overlap.
 
Does it really matter where Vulkan and D3D12 come from? Kudos to AMD for it, but in the end these APIs are independent of Mantle, even if there is a lot of overlap.
That's not actually true for Vulkan; it is based on Mantle. Had there not been Mantle, it could look a lot different than it does now.
 
My question is: is Vulkan going to be more relevant to gaming than Mantle? ;) OpenGL isn't exactly popular in gaming. With Mantle, at least, AMD conned a few companies into putting resources into a render path to make their APUs and FX CPUs look better.
 
My question is: is Vulkan going to be more relevant to gaming than Mantle? ;) OpenGL isn't exactly popular in gaming. With Mantle, at least, AMD conned a few companies into putting resources into a render path to make their APUs and FX CPUs look better.
Hasn't Google thrown its weight behind Vulkan? That alone could mean good things are in Vulkan's future.
As an aside, why use the word "conned"? I thought this was a forum for smart people to discuss the industry; why use such inflammatory language?
 
Google will throw its weight behind anything for a short time and then abandon it to the community.
 