Hairworks, apprehensions about closed source libraries proven beyond reasonable doubt?

Status
Not open for further replies.
In the end, it's gamers, the consumers on forums, who complain

A while back we were discussing how GameWorks was harmful, as opposed to Mantle, which was benign. This is more evidence of that.

AMD probably spent a pretty penny on Mantle instead, and kept it closed until it became irrelevant in the PC space.

It's so closed that it's in DX12 and Vulkan. Mantle spurred the industry forward, and I would argue it's the most relevant development in the PC space in a while.
 
It's a bit harsh. The problem is we keep going around in circles on this question: are games made with GameWorks made to favor Nvidia? Yes. Do they favor AMD? No.

It's funny how they will disappear from hardware reviews as fast as Batman, Assassin's Creed, and FC4 did... period... end of discussion.

But I hope this trend stops, or Ryan (AnandTech) will need to test 2016 GPUs with 2012 games...
 
It's so closed that it's in DX12 and Vulkan. Mantle spurred the industry forward, and I would argue it's the most relevant development in the PC space in a while.
Microsoft took the ideas of Mantle, yes. How does that make Mantle open? And AMD noticed that Mantle was pointless in the PC Windows space, so they gave it to Khronos.

But let's not forget that when Intel asked for Mantle they were denied access. And that they forever stalled releasing Mantle to the public. It's one thing to call something open, it's another to make it open. Calling something open is PR. Making something open means you allow your competitors to start working on it, giving away your early advantage.

I am not contending in any way that Mantle didn't kick off a major development in the PC space. It obviously did. I'm just pointing out the incredible hypocrisy and holier-than-thou rhetoric around it.
 
Intel was denied access to Mantle? ... lol, that's news to me. Did you read that on Forbes, maybe?

I read that AMD was honored to have Intel ask them for access to Mantle...

And I say that with all due respect to you... I mean, this is not any kind of attack.

As for openness, given the actual situation, I would ask what Nvidia has asked for, not Intel. We are dumb, and I'm the first one: we take the history written on hardware sites instead of taking people's actual words; we comment on journalists' spin instead of asking the interested parties.

On May 28th there's an open conference with AMD; maybe we should take that opportunity and ask them too.

I'll put the link here for the record, so you can do what you want with it. Of course Grasby isn't really on the technical side, but who knows, maybe he can give you some answers... (poor guy, I'm putting pressure on him)

http://experience.amd.com/amdwebinar?utm_source=silverpop&utm_medium=email&utm_campaign=22692671&utm_term=Webinar-Header&utm_content=global-partner-webinar-meet-the-expert-may-invite-new-attendees-reminder (1):&spMailingID=22692671&spUserID=NzI2NzY4NzE1NTIS1&spJobID=562076587&spReportId=NTYyMDc2NTg3S0

I'm not sure Gab can answer this, but well, we can try.

Of course, this open webinar is mostly there to talk about the next AMD GPU lineup...
 
Last edited:
Accusing your competition of sabotage because your own driver/hardware can't keep up is.

Don't pretend we're talking about drivers and hardware.
GTA V, Battlefield Hardline, Evolve, DA: Inquisition and even Witcher 3 without Hairworks show how drivers and hardware are competing, and no one is complaining about that.

We're talking about shady developer deals about using black box DLLs.
Were you really hoping to trick anyone here into thinking this is about drivers and hardware? In B3D?

I'm not sure if that was very brave, very naïve, or both.


And AMD noticed that Mantle was pointless in the PC windows space, so they gave it to Khronos.
AMD almost single-handedly wrote the next most widely used, cross-OS, cross-platform 3D API (Vulkan), and you actually tried to pass that off as a bad thing?
Wow.

What's next? Freesync is bad because it's open? Developing x64 was bad because 64bit is overrated?

I'm actually curious to know how far that reality distortion field can go.
 
Microsoft took the ideas of Mantle, yes. How does that make Mantle open? And AMD noticed that Mantle was pointless in the PC Windows space, so they gave it to Khronos.

But let's not forget that when Intel asked for Mantle they were denied access. And that they forever stalled releasing Mantle to the public. It's one thing to call something open, it's another to make it open. Calling something open is PR. Making something open means you allow your competitors to start working on it, giving away your early advantage.

I am not contending in any way that Mantle didn't kick off a major development in the PC space. It obviously did. I'm just pointing out the incredible hypocrisy and holier-than-thou rhetoric around it.

As you said it: they gave it to Khronos.

The only complaint about Mantle was that AMD would have an unfair advantage because they would control the API. So they gave it to Khronos.

AMD is being schooled for announcing an API would be open and then giving it to a nonprofit industry consortium that develops open APIs?

Is this real?
 
Gotcha, so you'd be absolutely happy in a world where, if both IHVs had equal GPU market share, you couldn't play half the games released because they ran horribly on your video card.

Even better, what if there were 3 IHVs? Then you'd be limited to playing only 1/3 of available games, because the other 2/3 would perform horribly due to closed-source sabotaging of performance. It's great that you applaud an IHV for abusing its monopoly.

First of all, I'd say there are in fact 3 major IHVs. But if AMD keeps doing things the way it has been, that sadly might not be the case forever.
Secondly, calling Nvidia a monopoly takes serious liberties with the actual meaning of that word.
And thirdly, conjuring up a hypothetical dysfunctional future and arguing against that is too much of a strawman technique to be your actual concern. Hopefully.

Fact is that there's really nothing preventing you from disabling the fancy hair and still enjoying a great game.
The content and value available to you is easily upwards of 99%. Point nine nine, even. We're talking about optional fur here.

Aren't most PC users familiar by now with settling for compromises, like say a lower resolution, or less or no AA, to keep a particular piece of software running smoothly on their hardware?
I think that flexibility at different price points is a benefit of the platform.

So I'm hearing that 'bang for the buck' is a popular feature of AMD products among its customers (if less so among financial analysts).
But first row seats come at a premium in most venues. The view from row 11 is still great though. And fortunately most people just enjoy the show without worrying about what others paid.

But at least we're seeing some backlash from software developers, at least those that aren't tied monetarily to Nvidia. And we should start seeing the death of GameWorks. And good riddance.

And this coming from an Nvidia user that isn't happy with the situation.

Regards,
SB

To be honest I hadn't expected someone who so frequently speaks to us at length about their moral convictions with regards to consumer issues to switch to Nvidia.
I hope you are pleased with your purchase and I have to say that to me this is a very bullish sign as far as market share is concerned!

Still, I'm inclined to disagree with your last paragraph as well. Gameworks looks like a success and a very compelling proposition. Particularly to independent or smaller development studios.
I expect it will be around in some form or another for a long time. As will the predictable but always entertaining gnashing of teeth.
 
As you said it: they gave it to Khronos.

The only complaint about Mantle was that AMD would have an unfair advantage because they would control the API. So they gave it to Khronos.

After it ceased to be relevant. The touted openness never had any practical benefit. It never did anything for Nvidia or Intel customers, ever.
Arguably the opposite: Major studios delivered some seriously flawed releases but still found time to spend effort on a separate rendering code path that ended up delivering a few extra frames per second for a select few users. With some bugs of its own.

And now AMD dumped the support effort onto someone else. Kinda like it did with Linux drivers.
I suppose that's still more impressive than just riding the coattails of someone else's initiative from the start, like say with OpenCL, Havok, Bullet physics and such.
'Open', as in someone else can take care of that for us.

AMD is being schooled for announcing an API would be open and then giving it to a nonprofit industry consortium that develops open APIs?

Is this real?

I don't know about anyone else but I'm feeling your excitement!
 
Microsoft took the ideas of Mantle, yes. How does that make Mantle open? And AMD noticed that Mantle was pointless in the PC Windows space, so they gave it to Khronos.

But let's not forget that when Intel asked for Mantle they were denied access. And that they forever stalled releasing Mantle to the public. It's one thing to call something open, it's another to make it open. Calling something open is PR. Making something open means you allow your competitors to start working on it, giving away your early advantage.

I am not contending in any way that Mantle didn't kick off a major development in the PC space. It obviously did. I'm just pointing out the incredible hypocrisy and holier-than-thou rhetoric around it.

They said Mantle would remain closed while in beta, and would be open after that. That's exactly what happened, except that when it got out of beta it was called Vulkan instead.
 
After it ceased to be relevant. The touted openness never had any practical benefit. It never did anything for Nvidia or Intel customers, ever.
Arguably the opposite: Major studios delivered some seriously flawed releases but still found time to spend effort on a separate rendering code path that ended up delivering a few extra frames per second for a select few users. With some bugs of its own.

And now AMD dumped the support effort onto someone else. Kinda like it did with Linux drivers.
I suppose that's still more impressive than just riding the coattails of someone else's initiative from the start, like say with OpenCL, Havok, Bullet physics and such.
'Open', as in someone else can take care of that for us.



I don't know about anyone else but I'm feeling your excitement!

It's baffling. The "touted openness" is the sole reason Vulkan exists as it is.

How can you try to say that AMD "dumped" it? Khronos should sue AMD then!

http://www.ustream.tv/recorded/59559306/theater

Khronos (Vulkan) stuff from GDC
They also say it outright: there are tons of companies involved, but they thank only one by name, AMD, for giving them Mantle to build on.
(Attached image: GDC_2015_Vulkan_Thanks_AMD.png)



edit:
Slide decks: https://www.khronos.org/developers/library/2015-gdc
 
My biggest problem with this whole HairWorks thing is that Nvidia cards themselves would probably benefit from a lower tessellation level. It's as if Nvidia pushed the setting to 11 because they knew it would harm AMD more than their own hardware. Having seen HairWorks at several different tessellation levels, 4x should work for the low setting, 8x for high, and 16x for ultra. Instead, Nvidia went for the highest level, where their cards at least take the hit better than AMD's.
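To put rough numbers on why the factor matters: with isoline tessellation, the amount of generated geometry grows roughly linearly with the tessellation factor, so 64x emits about eight times the vertices of 8x. A back-of-the-envelope sketch in Python (the strand and segment counts are illustrative assumptions, not actual HairWorks values):

```python
# Rough geometry-cost estimate for hair tessellation at different factors.
# Strand count and base segments are made-up illustrative numbers, not game data.
def hair_vertices(strands, base_segments, tess_factor):
    # Isoline tessellation subdivides each strand segment tess_factor times;
    # each subdivision contributes roughly one extra vertex.
    return strands * base_segments * tess_factor

STRANDS = 10_000       # assumed number of hair strands
BASE_SEGMENTS = 4      # assumed control segments per strand

for factor in (4, 8, 16, 64):
    v = hair_vertices(STRANDS, BASE_SEGMENTS, factor)
    print(f"{factor:>2}x tessellation -> {v:,} vertices")
```

Under this (simplified) linear model, stepping from the proposed 16x "ultra" to the shipped 64x quadruples the vertex load for everyone, which is the crux of the complaint.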
 
Almost as if it was a feature designed primarily to harm the competition rather than to improve the user experience for anyone.
 
My biggest problem with this whole HairWorks thing is that Nvidia cards themselves would probably benefit from a lower tessellation level. It's as if Nvidia pushed the setting to 11 because they knew it would harm AMD more than their own hardware. Having seen HairWorks at several different tessellation levels, 4x should work for the low setting, 8x for high, and 16x for ultra. Instead, Nvidia went for the highest level, where their cards at least take the hit better than AMD's.

Absolutely love this reference to guitar amplifiers... turn it up to 11.
 
Food for thought:

**Right now, the gaming market is already pretty divided, with exclusive tech from the get-go.

NVIDIA has GPU PhysX (~40 games, still being used in at least 7 games per year), CUDA (4 games, surprisingly still being used at a rate of one game per year), and finally GameWorks, of course.

Intel has Pixel Sync, used in two games.

AMD has Mantle(7 games), now retired.

If The Witcher 3 had been released with CUDA or PhysX, no one would complain; NVIDIA chose to implement several visual technologies through compute shaders. They locked down only TXAA. Now hair simulation is available on all cards from all IHVs, albeit with a performance penalty, which is a much better offer than simply locking it down to one IHV.

** This fuss is unwarranted. Far Cry 4 and COD Ghosts had the same major NVIDIA advantage when hair simulation was enabled. AMD just wanted to create a distraction after a streak of game failures. Almost all major games now feature tech from NV, with AMD completely absent from the scene. Even late allies like EA and SE are having a U-turn moment right now with Battlefield Hardline and Final Fantasy 15. Not to mention the subpar performance beginning with games like Unity, Advanced Warfare, Dying Light, Project Cars, and now The Witcher 3; you can expect the next Batman, Assassin's Creed, Call of Duty, and Unreal Tournament to fall in line too.

**AMD had a similar situation with GCN-favorable shaders in Tomb Raider, Deus Ex HR, and Dirt Showdown. NVIDIA doesn't complain; they move on and optimize, until sometimes the situation is reversed. I expect this will happen again.

**AMD retired Mantle after failing to achieve any of its company-invested goals. It had many bugs that actually worked against some of its own bullet points (like increasing the memory footprint), it carried the burden of architectural backward compatibility, and it offered nothing in terms of reducing CPU overhead by a tangible amount or providing any advantage over the competition. Most of its headlines were made by sabotaging the DX11 path on AMD hardware. It also failed to stay relevant long enough to affect image quality. So yes, Mantle pressured the industry to accelerate the introduction of DX12 (the same goes for G-Sync and variable refresh rate), but that's about it when it comes to how Mantle was relevant to the consumer. The same logic can be applied to G-Sync too, if it fails to differentiate itself enough.
 
AMD had a similar situation with GCN-favorable shaders in Tomb Raider, Deus Ex HR, and Dirt Showdown. NVIDIA doesn't complain; they move on and optimize, until sometimes the situation is reversed. I expect this will happen again.

No, this is not similar, because the developers of those games had access to the shaders and could optimize them for all targeted architectures. NVIDIA had access to them as well, unless deliberate attempts were made to obfuscate them.

**AMD retired Mantle after failing to achieve any of its company-invested goals. It had many bugs that actually worked against some of its own bullet points (like increasing the memory footprint), it carried the burden of architectural backward compatibility, and it offered nothing in terms of reducing CPU overhead by a tangible amount or providing any advantage over the competition. Most of its headlines were made by sabotaging the DX11 path on AMD hardware. It also failed to stay relevant long enough to affect image quality. So yes, Mantle pressured the industry to accelerate the introduction of DX12 (the same goes for G-Sync and variable refresh rate), but that's about it when it comes to how Mantle was relevant to the consumer. The same logic can be applied to G-Sync too, if it fails to differentiate itself enough.

Almost none of this is correct.
 
Mantle was problematic, but the promise of open source was at least always there.

PhysX has had plenty of complaints, but various factors have conspired to make GPU physics a mostly toothless differentiator: generic physics just didn't suit GPUs all that well, and the consoles required CPU paths anyway. With GPUs becoming better at multitasking and NVIDIA caring less about the console space, this will change going forward.

Pixel Sync is in DirectX 12, and I haven't heard NVIDIA or AMD complain about patents, so presumably it's okay for AMD/NVIDIA to implement it (even with a software solution, I don't think AMD/NVIDIA will have much trouble competing).
 
I'm sure they have some cost, but most of these should be one-time costs.
You don't know that. There is more to performance than shader optimizations.
silent_guy said:
Some months ago, somebody posted on a forum somewhere about his experiences as an intern or employee at Nvidia: they seem to spend tons of effort optimizing their drivers for specific games, so you'd expect Nvidia drivers to have tons of overhead as well. The opposite is true. But Nvidia seems to be better at using at least 2 CPUs in their driver. Maybe AMD should have spent more time on that aspect?
And it's not as if there are that many GameWorks with GPU acceleration out there anyway.
It doesn't matter how many there are, what matters is how important they are to reviews and gamers, wouldn't you agree?
silent_guy said:
It should be fairly trivial for a company like AMD to put a snooping layer between a game and the Windows API and dump all the transactions. And if they then discovered that this were the case, we'd hear it loud and clear. Since we didn't hear any whining of that sort, I'm pretty confident that this is not happening. ;)
Windows API? What is that? Do you mean the DDI layer? If so, how would one correlate the DDI calls from one vendor to another given that the driver implementations are completely different?
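For what it's worth, the interposer idea being debated here is conceptually simple: wrap every entry point, record the call, then forward it to the real implementation. A toy Python sketch of that pattern (function names are hypothetical; a real D3D snooper would proxy the runtime DLL or hook the DDI, not Python functions):

```python
import functools

def trace_calls(log):
    """Decorator that records every call (name + args) before forwarding it."""
    def wrap(fn):
        @functools.wraps(fn)
        def inner(*args, **kwargs):
            log.append((fn.__name__, args, kwargs))  # record the transaction
            return fn(*args, **kwargs)               # forward to the real call
        return inner
    return wrap

# Hypothetical stand-ins for graphics API entry points.
calls = []

@trace_calls(calls)
def set_shader(name):
    return f"bound {name}"

@trace_calls(calls)
def draw_indexed(index_count, start_index):
    return f"drew {index_count} indices"

set_shader("hair_hull_shader")
draw_indexed(1_000_000, 0)
print(calls)  # the full transaction dump, in call order
```

The harder part, as the reply notes, is at which layer you interpose: a dump taken above the driver (the API) is comparable across vendors, while a dump at the DDI reflects each vendor's own driver structure and is much harder to correlate.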
 