PCR: AMD slams "tragic" Gameworks

One thing that frequently gets overlooked with Gameworks is that it acts as quite an elegant way of freeing Nvidia from the limitations of DirectX.

Now, I don't know how the specification process for a DX version actually plays out in cold, hard reality (I know the tidbits that are presented to the media as a sorry excuse for the truth...), but I could imagine that engineers and marketers alike would seek a solution when their products get the short end of the DX spec bundles. Some might come up with a whole new API, potentially allowing graphics features that go unused in a DX environment to be put to use. Others might take the route of black boxes, hiding their algorithms not only from the competition but also from the limiting API - especially when they see their competitor's products in the major game consoles, so that the tech level is basically set there. Since - my guess - it is highly unlikely that we will soon see games making use of the optional DX12 FL12_1 features, they can go ahead and put them to use via their black boxes.
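As an aside: those optional FL12_1 capabilities are queryable from any DX12 engine, black box or not. Here is a minimal sketch, untested and assuming an already-created ID3D12Device, of probing for the two headline FL12_1 features, conservative rasterization and rasterizer-ordered views:

```cpp
#include <d3d12.h>
#include <cstdio>

// Query the optional capabilities that feature level 12_1 bundles.
// Assumes 'device' was created elsewhere, e.g. via D3D12CreateDevice.
void PrintFL121Features(ID3D12Device* device)
{
    D3D12_FEATURE_DATA_D3D12_OPTIONS opts = {};
    if (SUCCEEDED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS,
                                              &opts, sizeof(opts))))
    {
        // FL12_1 requires conservative rasterization tier 1+ and ROVs;
        // lower feature levels may expose them as optional extras or not at all.
        std::printf("Conservative rasterization tier: %d\n",
                    static_cast<int>(opts.ConservativeRasterizationTier));
        std::printf("Rasterizer-ordered views: %s\n",
                    opts.ROVsSupported ? "yes" : "no");
    }
}
```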

After all, it was Richard Huddy who proclaimed years ago that "The API has to go away".

I agree, though, that Nvidia should play this game more transparently, if only for their own image. But then, they are a public company, and most of those are driven rather short-term…


Incidentally, prior to the Vulkan fork and Microsoft releasing DX12, Mantle got support from 8 very high-profile games during its short ~1 year of existence (2014-2015), whereas PhysX was used in 4 games over the same period.

While I won't defend PhysX - personally, I don't like the artistic way it is implemented in almost all the games I've seen (exception: Cellfactor!) and would prefer gameplay-enhancing physics effects (which won't happen for GPU PhysX, see Cellfactor) - I think that Mantle list is a bit misleading (as is the comparison with PhysX in the first place).

8 of those games are based on the same engine (Frostbite), so they basically have Mantle baked into their DNA anyway. 4 games are not yet released. So it's the Frostbite games, Civilization (LORE), Thief (UE3) and Sniper Elite (Asura) - which is a good score all by itself, mind you.
 
Yet Huddy proudly proclaimed that over 100 game development teams had signed up for Mantle...so a couple of handfuls of games before the program was euthanized probably isn't overly impressive considering the lofty claims that were announced beforehand.

Looking at the formally announced support for Mantle in many games and the reported development time needed for Mantle once the engine supported it, I don't doubt there were over 100 dev teams asking for the Mantle SDK.
Obviously, with the announcement of DX12 and its release a record 16 months after that announcement, all the teams with a 3-4 year development time dropped Mantle support for a cross-IHV API.
And the ones that didn't drop Mantle for DX12, dropped it after the announcement that Mantle was going to turn into Vulkan (duh).


What does PhysX have to do with this thread? It wasn't even mentioned in the article let alone by anyone in the thread prior to your shoehorning it into the proceedings.
So you're bothered that PhysX got shoehorned into this thread, but bringing up Mantle first in order to slam AMD was fair game, huh?
 
Here is the other problem for AMD: Gameworks features go hand in hand with each other, everything from physics to graphics to tessellation, so by using more than one of the libraries developers get more advantages. If Huddy thinks it's bad right now, just wait and see when a game comes out using 4 or 5 Gameworks features.
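To make the coupling concrete, here is a purely hypothetical sketch - the gw_* names below are invented stand-ins, not the real Gameworks API. The structural point is that once physics, ambient occlusion and post-processing all route through opaque calls into the same vendor SDK, swapping any one of them out later means reworking the whole chain:

```cpp
#include <cstdio>

// Invented stand-ins for opaque vendor-library entry points (hypothetical).
void gw_SimulateParticles() { std::puts("vendor physics step"); }
void gw_ApplyHBAO()         { std::puts("vendor ambient occlusion"); }
void gw_ApplyDepthOfField() { std::puts("vendor depth of field"); }

// Three separate subsystems, one vendor: the frame loop depends on the
// same black box for simulation, shading and post-processing alike.
void RenderFrame()
{
    gw_SimulateParticles();
    gw_ApplyHBAO();
    gw_ApplyDepthOfField();
}

int main()
{
    RenderFrame();
    return 0;
}
```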
 
Similar in many ways to how Microsoft/Apple could bundle their own browsers in their operating systems and make competing browsers run worse by denying them access to key information... oops, sorry, they aren't allowed to do that, but Nvidia is.

Regards,
SB

No, it's not similar in the least. Nvidia doesn't control the DirectX ecosystem the way Microsoft has a death grip on the PC OS. Nvidia has very limited leverage to "force" anything on anyone. The market can turn its back on them in a flash. I know that doesn't play into the evil-monopolist conspiracy theories, but they're called theories for a reason.

You can argue that they don't play nice, or that they aggressively press their advantage to maximize profits (duh), but to suggest they are strong-arming the market is disingenuous at best. They couldn't do that even if they wanted to.
 
The real loser here is the customer, no one else.

I've seen this posted for years, yet nobody has ever articulated a compelling argument as to how I as a consumer have lost anything due to competition. The basic argument that "proprietary features" are inherently bad for the consumer is kinda ridiculous when looking at consumer goods as a whole.

An even sillier argument claims that in the absence of IHV involvement we'll all get better games. That is just wishful thinking based on nothing at all.
 
Going along the same lines as Trinibwoy: we haven't seen much effect at all from any of these dev rel programs in terms of market-share swings. They might have some effect, but it's intangible when we look at historic market-share numbers. The only times nV or AMD/ATi have had any major swings were due to a lack of competition or competitive parts. So the "bad for the end user because it alienates them for choosing a certain hardware brand" argument just gets thrown out the window.

Adding in a few extra visual effects is not the be-all and end-all of gaming; consumers buy games that have good gameplay, story, etc. Visual effects can be eye-catching, but that doesn't hold a consumer for very long if the game isn't good.
 
Huddy was making the same noises even when AMD had their noses ahead after the 5xxx series launch, which was much akin to what Nvidia have done with the 9xx series. The constant is Nvidia's hobnobbing with developers, which ends up crippling performance on AMD cards. The recent HardOCP review of the Fury again provides instances of it.

In this test we have lowered the settings to a non-bottlenecked "Best Quality" setting. ASUS STRIX R9 Fury is 3% faster than GeForce GTX 980.

Now we have disabled GameWorks features altogether in this game below. We have turned off HBAO and are just using SSAO. We have turned off the NVIDIA Depth of Field option as well.

Performance jumped up on the ASUS STRIX R9 Fury. The Fury is now 31% faster than the GeForce GTX 980. The setting holding back performance seems to be the NVIDIA Depth of Field in this game. The GTX 980 can render it much better, the Fury not so much.

http://www.hardocp.com/article/2015/07/10/asus_strix_r9_fury_dc3_video_card_review/5#.VcDDHC6deZ8

And in Far Cry 4 as well:

http://www.hardocp.com/article/2015/07/10/asus_strix_r9_fury_dc3_video_card_review/6#.VcDDPS6deZ8
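For what it's worth, the quoted figures alone let you put a number on the swing. A back-of-the-envelope calculation, using only the relative percentages from the review (no absolute frame rates needed):

```cpp
#include <cstdio>

int main()
{
    // Fury/GTX 980 performance ratios from the [H] numbers quoted above.
    double lead_on  = 1.03; // GameWorks effects enabled:  Fury 3% ahead
    double lead_off = 1.31; // GameWorks effects disabled: Fury 31% ahead

    // The ratio of the two leads: the GameWorks path costs the Fury
    // roughly 27% more, relative to the GTX 980, than the plain path.
    std::printf("relative swing: %.1f%%\n",
                (lead_off / lead_on - 1.0) * 100.0);
    return 0;
}
```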

After the Crysis 2 fiasco, Nvidia are pretty much guilty until proven innocent.
 
I've seen this posted for years, yet nobody has ever articulated a compelling argument as to how I as a consumer have lost anything due to competition. The basic argument that "proprietary features" are inherently bad for the consumer is kinda ridiculous when looking at consumer goods as a whole.

An even sillier argument claims that in the absence of IHV involvement we'll all get better games. That is just wishful thinking based on nothing at all.
What about getting a broken product if you use it on competitor hardware - doesn't that count as a loss?
Isn't that forcing you to purchase that crippling IHV's hardware? An IHV which almost has a monopoly on high-end GPUs?

What if competitors started doing that too?
Would we have to buy multiple GPUs to have the games we want run well on our machines? How would we not lose in that case?

We made games before IHVs got involved, thank you very much; we don't need them involved beyond drivers and research papers.
 